HIPAA compliance is one of the most misunderstood topics in small business IT. Healthcare providers, dental offices, mental health practices, billing companies, and their technology vendors all operate under HIPAA — and most of them have a fuzzy understanding of what it actually requires.
This isn't legal advice. HIPAA compliance involves administrative and physical safeguards that go beyond IT, and your specific situation requires guidance from a qualified compliance consultant or attorney. What this is: a plain-English breakdown of the IT-specific requirements, so you can have an informed conversation with your IT provider about whether your setup is actually compliant.
Who HIPAA Applies To
HIPAA's Security Rule applies to two categories of organizations:
Covered Entities: Healthcare providers (doctors, dentists, therapists, hospitals, clinics), health plans (insurers, Medicare, Medicaid), and healthcare clearinghouses. If you provide healthcare or bill for it, you're a covered entity.
Business Associates: Any organization that handles Protected Health Information (PHI) on behalf of a covered entity. This includes IT companies, billing services, cloud storage providers, email platforms, transcription services, and many others. If your IT provider has access to PHI — which they almost certainly do if they manage systems that contain patient data — they are a Business Associate and must sign a Business Associate Agreement (BAA).
If your IT provider has access to systems containing patient data and has never presented you with a Business Associate Agreement, that is a HIPAA violation — right now, regardless of anything else about your setup. A BAA is not optional.
The Technical Safeguards: What HIPAA Actually Requires
HIPAA's Security Rule specifies "technical safeguards" — the technology controls required to protect electronic Protected Health Information (ePHI). Here's what each requirement means in practice:
Access Controls
Every user must have a unique login. No shared accounts. No generic "front desk" logins that multiple staff members use. Each person who accesses systems containing patient data needs their own username and password, and access should be limited to only what each role needs to do their job.
In practice: unique Active Directory or Google Workspace accounts for every employee, with permissions set by role. A front desk coordinator shouldn't have the same system access as a billing manager or clinical staff.
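The role-based idea can be sketched in a few lines of code. This is an illustrative toy, not how Active Directory or Google Workspace implements permissions; the role names and permission names are made up for the example.

```python
# Illustrative role-based access control (RBAC) check -- the roles and
# permissions here are hypothetical examples, not a HIPAA-mandated scheme.
ROLE_PERMISSIONS = {
    "front_desk": {"view_schedule", "update_demographics"},
    "billing": {"view_schedule", "view_claims", "submit_claims"},
    "clinical": {"view_schedule", "view_chart", "update_chart"},
}

def can_access(role: str, permission: str) -> bool:
    """Grant access only when the role explicitly includes the permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

# The front desk cannot open clinical charts; clinical staff can.
assert can_access("front_desk", "view_chart") is False
assert can_access("clinical", "view_chart") is True
```

The key design property is deny-by-default: an unknown role, or a permission not explicitly granted, resolves to no access.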
Audit Controls
Your systems must log who accessed what, when. If a patient calls to say their records were inappropriately accessed, you need to be able to pull logs showing exactly who viewed their file and when.
In practice: audit logging enabled in your EHR (Electronic Health Record) system, email platform, and any other system containing ePHI. Most HIPAA-compliant EHR platforms have this built in — the gap is usually in peripheral systems like email or cloud storage.
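The shape of an audit trail is simple: who, what, when. The sketch below is a toy in-memory version, assuming made-up usernames and patient IDs; a real EHR keeps this in a tamper-resistant store, but the query you need to answer is the same.

```python
from datetime import datetime, timezone

# Toy audit trail -- real systems use an append-only, tamper-resistant store.
audit_log: list[dict] = []

def record_access(user: str, patient_id: str, action: str) -> None:
    """Log every access to a patient record with a UTC timestamp."""
    audit_log.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "patient": patient_id,
        "action": action,
    })

def accesses_for(patient_id: str) -> list[dict]:
    """Answer the audit question: who touched this patient's file, and when?"""
    return [entry for entry in audit_log if entry["patient"] == patient_id]

record_access("jsmith", "PT-1001", "view_chart")
record_access("mlee", "PT-2002", "view_chart")
record_access("jsmith", "PT-1001", "print_chart")
assert [e["user"] for e in accesses_for("PT-1001")] == ["jsmith", "jsmith"]
```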
Integrity Controls
ePHI must be protected from unauthorized alteration or destruction. This means controls that ensure patient data isn't modified or deleted without authorization and that you'd know if it was.
In practice: version history in document systems, backup systems that maintain historical copies, and access controls that prevent unauthorized deletions.
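One common building block behind integrity controls is a cryptographic hash: store a fingerprint of a record, and any later change to the record changes the fingerprint. A minimal sketch, using a made-up record:

```python
import hashlib

def fingerprint(record: bytes) -> str:
    """SHA-256 digest of a record; any alteration produces a different digest."""
    return hashlib.sha256(record).hexdigest()

original = b"Patient PT-1001: visit note, 2024-03-01"
baseline = fingerprint(original)

# An unmodified record verifies against its stored baseline...
assert fingerprint(original) == baseline
# ...and any unauthorized edit, however small, is detectable.
assert fingerprint(original + b".") != baseline
```

This detects unauthorized alteration; it doesn't prevent it. Prevention comes from the access controls and backups described above.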
Transmission Security
ePHI transmitted over a network must be encrypted. Emailing unencrypted patient data is a HIPAA violation. Sending PHI over standard SMS is a violation. Patient data in transit must be protected.
In practice: encrypted email (either a HIPAA-compliant email platform or a secure messaging add-on), encrypted file transfer for sharing records, HTTPS for any web-based patient portals.
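For any in-house code that moves patient data, the rule can be enforced rather than remembered. A small sketch, where the URL and hostname are placeholders and TLS 1.2 is assumed as a reasonable current floor:

```python
import ssl
from urllib.parse import urlparse

def require_https(url: str) -> str:
    """Refuse to send ePHI to any endpoint that isn't HTTPS."""
    if urlparse(url).scheme != "https":
        raise ValueError(f"refusing to transmit ePHI over non-HTTPS URL: {url}")
    return url

# Certificate-verifying TLS settings for client code; TLS 1.2 is a
# commonly cited minimum in current security guidance (an assumption here).
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

assert require_https("https://portal.example.com/records")
```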
The Things That Aren't In HIPAA But Should Be In Your Setup Anyway
HIPAA's technical requirements are broad and principle-based rather than prescriptive — they say what outcomes to achieve, not exactly how to achieve them. The following are, for the most part, not mandated by name in a specific HIPAA clause, but they are considered standard practice for a compliant environment and would be expected in an audit:
- Multi-factor authentication (MFA) on all systems accessing ePHI. HIPAA never names MFA, but it falls under the Security Rule's access control and authentication standards — if you choose not to implement it, auditors will expect a documented, risk-based justification.
- Full disk encryption on all laptops and portable devices. A stolen laptop with unencrypted patient data is a reportable breach. A stolen laptop with encrypted data is not, as long as the encryption key wasn't also compromised.
- Endpoint Detection and Response (EDR) on devices that access ePHI. HIPAA requires protection against malware — traditional antivirus is increasingly insufficient.
- Tested, encrypted backups stored separately from production systems. Ransomware that encrypts your backup alongside your data is a disaster and a breach. Backups should be isolated and tested regularly.
- Security Awareness Training (SAT) for all staff. Phishing is the most common initial access vector in healthcare breaches, and HIPAA's administrative safeguards do call for a security awareness and training program for the workforce.
The Business Associate Agreement (BAA): What It Is and Why It Matters
A BAA is a contract between a covered entity and a business associate that establishes the business associate's responsibility to protect ePHI. Every vendor with access to patient data must have a signed BAA on file.
This includes:
- Your IT managed service provider
- Your email platform (whatever you use must offer a BAA; Microsoft 365 and Google Workspace both have HIPAA BAAs available on qualifying plans)
- Your cloud storage provider (OneDrive, Google Drive, Dropbox Business — check whether your plan includes a BAA)
- Your EHR software vendor
- Any telehealth platform
- Your billing service if outsourced
- Any transcription, answering service, or other vendor who handles patient data
Missing BAAs are one of the most common findings in HIPAA audits — and one of the easiest to fix. Create a list of every vendor who touches patient data and verify you have a signed BAA for each one.
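The vendor inventory described above is simple enough to keep as structured data. A minimal sketch, with made-up vendor names and statuses:

```python
# A minimal vendor inventory; names and statuses are fabricated examples.
vendors = [
    {"name": "Managed IT provider", "handles_phi": True,  "baa_signed": True},
    {"name": "Email platform",      "handles_phi": True,  "baa_signed": False},
    {"name": "Office supply store", "handles_phi": False, "baa_signed": False},
]

def missing_baas(vendor_list: list[dict]) -> list[str]:
    """Every vendor that touches PHI but has no signed BAA on file."""
    return [v["name"] for v in vendor_list
            if v["handles_phi"] and not v["baa_signed"]]

assert missing_baas(vendors) == ["Email platform"]
```

A spreadsheet works just as well; the point is that the gap list should be empty and should be rechecked whenever a vendor is added.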
HIPAA IT Violations That Are Surprisingly Common
Texting patient information from personal phones. Standard SMS is not encrypted and is not HIPAA-compliant. Staff who text appointment reminders or clinical information to patients using personal phones are creating violations.
Emailing unencrypted PHI. Forwarding a patient file from your work email to your personal Gmail to work from home is a violation. Emailing test results or records without encryption is a violation.
Using personal cloud storage. Saving patient records to a personal Google Drive or Dropbox account — even "just for convenience" — is a violation if those accounts don't have a BAA in place.
Leaving workstations unlocked and unattended. A workstation with a patient record visible on screen in a waiting area is a physical safeguard failure with a technical component — automatic screen lock after inactivity is a standard control.
No MFA on remote access. Remote access to systems containing ePHI — VPN, remote desktop, cloud applications — without MFA is increasingly treated as a de facto violation given the current threat environment.
What a HIPAA-Compliant IT Setup Looks Like
For a typical small healthcare practice, a compliant IT environment generally includes:
- Unique user accounts for every employee, with role-based permissions
- MFA enforced on all cloud applications and remote access
- Encrypted email platform with a BAA (M365 or Google Workspace Business/Enterprise plans)
- Full disk encryption on all laptops and desktops
- EDR or MDR solution on all devices
- Encrypted, tested backups — isolated from production systems
- Automatic screen lock after inactivity (typically 5–15 minutes)
- Security Awareness Training for all staff, documented annually
- Signed BAAs with every vendor who handles ePHI
- A documented Security Risk Assessment (required by HIPAA — not optional)
- A written incident response plan
The Security Risk Assessment is worth emphasizing: HIPAA explicitly requires covered entities to conduct a thorough assessment of the potential risks and vulnerabilities to ePHI. This is a documented process, not a checkbox — and it's required to be updated regularly or when the environment changes significantly.
The Breach Notification Piece
If you do have a breach — unauthorized access to, disclosure of, or acquisition of ePHI — HIPAA requires specific notification steps:
- Notify affected individuals without unreasonable delay, and no later than 60 days after discovery
- If the breach affects 500+ individuals in a state, notify prominent media outlets in that state
- Report to the Department of Health and Human Services (HHS) — breaches of 500+ individuals must be reported within 60 days; smaller breaches are reported annually
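The individual-notification clock is worth making concrete. A minimal sketch of the deadline arithmetic — this covers only the 60-day individual notice, not the separate media and HHS requirements, and "without unreasonable delay" can mean sooner:

```python
from datetime import date, timedelta

def individual_notice_deadline(discovery: date) -> date:
    """Outer bound for notifying individuals: 60 days after discovery.
    Notification must also happen without unreasonable delay, which
    can be well before this date."""
    return discovery + timedelta(days=60)

# A breach discovered March 1, 2024 must be reported to affected
# individuals no later than April 30, 2024.
assert individual_notice_deadline(date(2024, 3, 1)) == date(2024, 4, 30)
```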
Your cyber insurance carrier should be in the loop immediately as well — many policies include breach response services that can manage notifications and forensics.
Working With Your IT Provider on HIPAA
A good IT provider serving healthcare clients should proactively understand HIPAA requirements and configure your environment accordingly. They should have a signed BAA with you. They should be able to tell you, specifically, what controls are in place that address each HIPAA technical safeguard requirement.
If your IT provider says they "handle HIPAA" but can't walk you through the specific controls, that's worth pressing on. "We're HIPAA-compliant" and "your environment meets HIPAA's technical safeguard requirements" are different claims. The latter requires specifics.