Posted on
May 13, 2026
Learn how true AI scribe Epic EHR integration works via Smart-on-FHIR and NoteWriter—not clipboard workarounds. A technical guide for CMIOs & IT Directors.
AI Scribe for Epic EHR: Integration & Workflow Guide
TL;DR: Most AI scribe "Epic integrations" file notes to Chart Review or open a paste window—forcing 20+ manual clicks. True NoteWriter integration requires an encounter-scoped Smart-on-FHIR launch with launch/patient, launch/encounter, a valid NoteTypeID, and notes-write capability. Scribing.io's app pushes editable drafts directly into NoteWriter with SmartData Element (SDE) prefill, eliminating copy-paste and enabling discrete data capture for coding and prior authorization. See Scribing.io Pricing.
Why NoteWriter Integration Matters: The 20-Click Problem CMIOs Must Solve
Why Vanilla FHIR Misses NoteWriter and What the Correct Architecture Requires
Scribing.io Clinical Logic: Handling an Orthopedic Surgeon's ACL Documentation
Technical Reference: ICD-10 Documentation Standards for ACL Injuries
Step-by-Step: The Encounter-Scoped Launch Sequence
All-Party Consent State Compliance Framework
CMIO Evaluation Criteria: Separating Real Integration from Marketing Claims
Book a Technical Demo: Live NoteWriter Push in Epic Sandbox
Why NoteWriter Integration Matters: The 20-Click Problem CMIOs Must Solve
Every Chief Medical Information Officer evaluating ambient AI scribes confronts an identical hidden failure: the gap between "integrated with Epic" and "actually inside the clinician's note workflow." Scribing.io exists to close that gap at the API layer—not with a better clipboard, but by eliminating clipboard dependency entirely.
The AMA's 2024 physician practice benchmarks confirm that documentation burden remains the primary driver of burnout, with clinicians spending an average of 16 minutes per encounter on EHR documentation tasks. A significant portion of that time—often 4 to 6 minutes—is consumed by navigation clicks, copy-paste maneuvers, and manual field population that a properly scoped integration eliminates entirely. For organizations exploring cross-platform approaches, our EHR Compatibility guide maps the technical differences across major systems.
The competitive landscape in 2026 reveals a pattern that CMIOs must interrogate directly during vendor evaluation. Marketing materials universally claim "bi-directional EHR integration" without specifying where the generated note lands inside Epic. The distinction is architecturally significant:
Epic Note Delivery: Where AI Scribe Output Actually Lands

| Integration Method | Where Note Appears | Clinician Action Required | Click Overhead | Discrete Data Captured? |
|---|---|---|---|---|
| Clipboard / Paste Window | External buffer → manual paste into open note | Open NoteWriter → position cursor → paste → reformat → sign | ~20–25 clicks | No |
| FHIR DocumentReference.create (vanilla) | Chart Review / Media tab / unattached document | Locate document → open → copy → navigate to NoteWriter → paste → sign | ~18–22 clicks | No |
| Encounter-scoped Smart-on-FHIR + notes-write + NoteTypeID | NoteWriter as editable draft for active encounter | Review → cosign (if routed) | 2–3 clicks | Yes (via SDE mapping) |
The bottom row is what Scribing.io's Epic EHR Integration delivers. The top two rows represent what most competitors—including those with high satisfaction scores on general usability—actually provide when you examine their Epic data flow at the API level.
Why Vanilla FHIR Misses NoteWriter and What the Correct Architecture Requires
Most "Epic integrations" are glorified copy-paste. To actually remove approximately 20 clicks, the note must be inserted as an editable draft inside Epic's NoteWriter for the active encounter. Competitors overlook a critical technical reality: a vanilla FHIR DocumentReference.create call typically files the document to Chart Review (or as an unattached document) and does not surface in NoteWriter where the clinician is actively working.
The ONC's SMART Health IT framework defines the authorization model, but Epic's proprietary implementation requires specific capabilities beyond the base spec. Understanding this distinction separates genuine integration from vendor theater.
The Correct Technical Path
The architecture that achieves true NoteWriter insertion requires all of the following components working in concert:
Encounter-scoped Smart-on-FHIR launch — The app must be launched within the context of a specific patient encounter. This requires OAuth2 scopes including launch/patient and launch/encounter, which pass the active encounter's FHIR ID to the application at runtime. Without launch/encounter, the app has no target context for note placement.

Epic's notes-write capability — Beyond standard FHIR write operations, the app must invoke Epic's proprietary notes-write endpoint (or the appropriate R4 equivalent configured for NoteWriter delivery). This is distinct from generic document creation. The Epic on FHIR documentation specifies this as a separate capability requiring explicit provisioning during App Orchard review.

Valid NoteTypeID — Every Epic organization defines note types (Progress Note, H&P, Procedure Note, Orthopedic Progress Note, etc.) with specific identifiers. The AI scribe must pass the correct NoteTypeID corresponding to the encounter's context. Missing or invalid NoteTypeIDs cause Epic to reject the note or file it to a default location outside NoteWriter.

Author and encounter identifiers — The note must carry the authenticated provider's identifier and the encounter reference to associate it correctly with the visit timeline.

SmartData Element (SDE) mapping — To populate structured examination fields (ROS, Physical Exam, Assessment), the app maps extracted clinical findings to Epic's SmartData Elements. This enables discrete data capture that downstream systems—coding engines, clinical decision support, payer portals—can consume without NLP re-processing of narrative text.
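As a concrete illustration of the launch-scope requirement, the sketch below builds a SMART on FHIR EHR-launch authorization URL with the scopes named above. The endpoint URLs, client ID, and the exact notes-write scope string are assumptions for illustration only; the real values come from Epic app registration and provisioning.

```python
# Minimal sketch of the EHR-launch authorization request, assuming a generic
# SMART on FHIR authorization server. URLs, client ID, and the notes-write
# scope name are hypothetical placeholders, not Epic's published values.
from urllib.parse import urlencode

AUTHORIZE_URL = "https://ehr.example.org/oauth2/authorize"   # hypothetical
FHIR_BASE = "https://ehr.example.org/api/FHIR/R4"            # hypothetical
CLIENT_ID = "scribe-app-client-id"                           # hypothetical
REDIRECT_URI = "https://app.example.com/smart/callback"      # hypothetical

# launch/encounter is the critical scope: without it the token response
# carries no encounter ID, so the app has no target for note placement.
SCOPES = [
    "launch",              # EHR launch context handoff
    "launch/patient",      # patient FHIR ID returned with the token
    "launch/encounter",    # encounter FHIR ID returned with the token
    "openid", "fhirUser",  # authenticated provider identity
    "patient/*.read",      # read access used for context validation
]

def build_authorize_url(launch_token: str, state: str) -> str:
    """Build the redirect that starts the SMART EHR-launch flow."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": " ".join(SCOPES),
        "launch": launch_token,  # opaque value Epic passes at activity launch
        "state": state,
        "aud": FHIR_BASE,
    }
    return f"{AUTHORIZE_URL}?{urlencode(params)}"
```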
What Competitors Miss
The competitor landscape broadly falls into two categories:
Category A (Paste-window vendors): The app generates text and presents it in a sidebar or overlay. The MA or clinician manually pastes it into an open note. No API-level note creation occurs. These vendors may integrate with athenahealth API or other platforms differently, but their Epic workflow remains clipboard-dependent.
Category B (DocumentReference vendors): The app uses standard FHIR APIs to create a document. Without encounter-scoping and NoteTypeID, Epic files it to Chart Review. The clinician must navigate away from their workflow to find, open, and copy the content.
Neither category achieves what CMIOs actually need: a zero-friction path from ambient capture to editable, cosign-ready NoteWriter draft with discrete data in SDE fields.
Scribing.io Clinical Logic: Handling an Orthopedic Surgeon's ACL Documentation in an All-Party Consent State
The Scenario
An orthopedic surgeon in an all-party consent state documents a new knee injury. Their current "Epic integration" only opens a paste window, so the MA pastes a free-text transcript after the visit. The payer cannot find discrete Lachman grading in NoteWriter's exam SDEs and denies prior authorization for ACL reconstruction, delaying surgery and risking a $6,800 loss.
The Clinical Failure Chain
Documentation Failure: Paste-Window Integration vs. Scribing.io NoteWriter Integration

| Step | Paste-Window Workflow (Current State) | Scribing.io NoteWriter Workflow |
|---|---|---|
| 1. Encounter consent | All-party consent obtained; recording begins | All-party consent obtained; recording begins |
| 2. Surgeon dictation | "Lachman grade 2, 6 mm translation" captured as audio | "Lachman grade 2, 6 mm translation" captured as audio |
| 3. AI transcription + NLP | Free-text paragraph generated in external app | Clinical NLP extracts: Lachman = Grade 2; anterior tibial translation = 6 mm; laterality = right |
| 4. EHR delivery | MA opens paste window → copies text → pastes into note body | Smart-on-FHIR app (encounter-scoped launch) maps findings to Knee Exam SDEs and pushes Orthopedic Progress Note (NoteTypeID validated) directly into NoteWriter |
| 5. Note structure | Unstructured free-text blob; no discrete SDE values populated | Discrete fields populated: Lachman Grade = 2, Translation = 6 mm; narrative note also present as editable draft |
| 6. Cosign routing | None (MA-pasted note lacks routing logic) | Auto-routed to surgeon for cosign per practice policy |
| 7. Payer prior auth review | Payer's automated system searches for discrete exam findings → finds none → denies authorization | Payer's automated system queries discrete SDEs → confirms Grade 2 Lachman with measurable translation → approves same day |
| 8. Financial outcome | Delayed surgery; appeal process; potential $6,800 revenue loss | Same-day approval; surgery scheduled within standard timeline; revenue preserved |
| 9. Click burden | ~20–25 clicks (open note, paste, scroll, reformat, sign) | 2–3 clicks (review draft, cosign) |
Granular Step-by-Step: How Scribing.io Solves This Problem
The following logic breakdown traces exactly how Scribing.io's Smart-on-FHIR architecture converts dictated clinical findings into a payer-approved, discrete-data-rich NoteWriter entry:
Pre-encounter: Consent state detection. Scribing.io's configuration layer identifies the practice's jurisdiction as an all-party consent state. The app surfaces a consent prompt workflow to the MA before recording activates. Consent acknowledgment is timestamped and stored as metadata on the encounter record, satisfying state recording law requirements.
Audio capture and transport. The encounter audio is captured via the clinic's preferred hardware (ambient microphone array or mobile device). Audio streams to Scribing.io's HIPAA-compliant processing layer over TLS 1.3 in transit and is stored with AES-256 encryption at rest. No audio persists after transcription unless the organization opts into archival.
Clinical NLP extraction. The transcription engine produces raw text. A second-pass clinical NLP model extracts structured entities:
Exam finding: Lachman test
Grade: 2
Measurement: 6 mm anterior tibial translation
Laterality: right (inferred from encounter context and explicit dictation)
Ligament: anterior cruciate
These entities are mapped to SNOMED CT concepts and then to the organization's specific Epic SDE identifiers via a configuration table maintained during implementation.
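A minimal sketch of that mapping step, assuming a per-site configuration table and illustrative SDE identifiers (the real identifiers are defined by each Epic organization during implementation, and the intermediate SNOMED CT concept IDs are omitted here rather than invented):

```python
# Illustrative only: extracted clinical entities are mapped onto configured
# SmartData Element identifiers. The entity keys and SDE IDs are placeholders.
EXTRACTED = {
    "exam_finding": "Lachman test",
    "grade": 2,
    "anterior_translation_mm": 6,
    "laterality": "right",
    "ligament": "anterior cruciate",
}

# Hypothetical per-site configuration: clinical concept -> Epic SDE identifier.
SDE_CONFIG = {
    "lachman_grade": "ORTHO#KNEE#LACHMAN_GRADE",
    "anterior_translation_mm": "ORTHO#KNEE#ANT_TRANSLATION_MM",
    "laterality": "ORTHO#KNEE#LATERALITY",
}

def to_sde_payload(entities: dict) -> dict:
    """Map extracted findings onto the configured SmartData Element IDs."""
    return {
        SDE_CONFIG["lachman_grade"]: entities["grade"],
        SDE_CONFIG["anterior_translation_mm"]: entities["anterior_translation_mm"],
        SDE_CONFIG["laterality"]: entities["laterality"].capitalize(),
    }

print(to_sde_payload(EXTRACTED))
# {'ORTHO#KNEE#LACHMAN_GRADE': 2, 'ORTHO#KNEE#ANT_TRANSLATION_MM': 6,
#  'ORTHO#KNEE#LATERALITY': 'Right'}
```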
Note assembly. The system assembles two parallel outputs:
Narrative note: A properly formatted Orthopedic Progress Note with HPI, ROS, Physical Examination (including the Lachman finding in clinical prose), Assessment, and Plan sections.
Discrete SDE payload: Structured key-value pairs for each mapped SmartData Element—Lachman Grade (2), Anterior Translation (6 mm), Laterality (Right), etc.
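To make the two parallel outputs concrete, here is a small, self-contained sketch in which one set of extracted findings produces both the narrative exam prose and the discrete payload; the field names and wording are illustrative, not Scribing.io's actual note templates.

```python
# Illustrative only: one set of extracted findings feeds two outputs,
# a narrative exam sentence for the note body and a discrete key-value
# payload destined for SmartData Elements. All names are placeholders.
def assemble_outputs(findings: dict) -> tuple[str, dict]:
    narrative = (
        f"{findings['laterality'].capitalize()} knee: Lachman grade {findings['grade']} "
        f"with {findings['translation_mm']} mm anterior tibial translation, consistent "
        f"with an {findings['ligament']} ligament injury."
    )
    discrete = {
        "Lachman Grade": findings["grade"],
        "Anterior Translation (mm)": findings["translation_mm"],
        "Laterality": findings["laterality"].capitalize(),
    }
    return narrative, discrete

narrative_exam, sde_payload = assemble_outputs({
    "laterality": "right",
    "grade": 2,
    "translation_mm": 6,
    "ligament": "anterior cruciate",
})
```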
Encounter-scoped Smart-on-FHIR launch. The app authenticates via OAuth2 with scopes launch/patient, launch/encounter, patient/*.read, and the notes-write capability. The launch context returns the active encounter's FHIR ID and the authenticated provider's NPI-linked identifier.

NoteTypeID resolution. The app queries the organization's NoteType catalog and selects "Orthopedic Progress Note" (or the site-specific equivalent). If the NoteTypeID cannot be resolved, the system alerts the implementation team rather than defaulting to an incorrect type—this prevents the note from landing in the wrong bucket.
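A hedged sketch of that resolution step, using an invented site catalog to show the fail-loudly behavior described above (actual note-type identifiers and names are defined per Epic organization during implementation):

```python
# Hypothetical NoteTypeID resolution. The catalog values are placeholders;
# failing loudly is preferred over silently defaulting to a generic note type.
NOTE_TYPE_CATALOG = {
    "orthopedic office visit": "1000020",   # hypothetical site-specific ID
    "telehealth visit": "1000031",
    "emergency": "1000007",
}

class NoteTypeUnresolved(Exception):
    """Raised so the implementation team is alerted instead of mis-filing the note."""

def resolve_note_type_id(encounter_type: str) -> str:
    key = encounter_type.strip().lower()
    if key not in NOTE_TYPE_CATALOG:
        raise NoteTypeUnresolved(
            f"No NoteTypeID mapped for encounter type: {encounter_type!r}"
        )
    return NOTE_TYPE_CATALOG[key]

note_type_id = resolve_note_type_id("Orthopedic office visit")  # -> "1000020"
```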
Notes-write API call. The assembled note (narrative + SDE payload) is pushed to Epic's notes-write endpoint with:
Encounter FHIR ID (encounter-scoping)
NoteTypeID (Orthopedic Progress Note)
Author identifier (surgeon's provider record)
Status: Draft (editable by the signing provider)
SDE values: mapped to the Knee Examination section
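The sketch below shows what such a submission could look like. Epic's notes-write capability is proprietary and provisioned per app, so the endpoint path, payload shape, and field names here are assumptions for illustration rather than Epic's documented API.

```python
# Hedged sketch of the note submission. The /notes-write path and payload
# field names are hypothetical; only the required ingredients (encounter,
# author, NoteTypeID, draft status, narrative sections, SDE values) mirror
# the components described in this guide.
import requests

def push_note_draft(
    fhir_base: str,
    access_token: str,
    encounter_id: str,
    author_id: str,
    note_type_id: str,
    narrative_sections: dict,
    sde_values: dict,
) -> requests.Response:
    payload = {
        "encounter": encounter_id,        # encounter-scoping from launch context
        "author": author_id,              # authenticated provider reference
        "noteTypeId": note_type_id,       # validated against the encounter type
        "status": "draft",                # editable by the signing provider
        "sections": narrative_sections,   # HPI, ROS, Physical Exam, A/P prose
        "smartDataElements": sde_values,  # discrete values (e.g. Lachman grade)
    }
    return requests.post(
        f"{fhir_base}/notes-write",       # hypothetical endpoint path
        json=payload,
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=30,
    )
```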
NoteWriter draft appears. The surgeon sees the note in NoteWriter for their active encounter. The Physical Examination section shows discrete Lachman grading. The narrative sections are editable. The note is flagged for cosign per the practice's attestation policy.
Cosign and close. The surgeon reviews (can edit any field), then cosigns. Total interaction: 2–3 clicks. The signed note with discrete SDE data is immediately available to downstream systems.
Payer query resolution. When the prior authorization request fires, the payer's automated system queries the encounter's discrete examination data. It finds Lachman Grade = 2 with 6 mm translation in structured fields. Medical necessity criteria are met. Authorization is approved same day.
Why This Matters for CMIOs
The failure described above is not hypothetical—it represents a systemic pattern wherever AI scribe output arrives as unstructured text. A JAMA Health Forum analysis of prior authorization denials found that incomplete or non-discrete documentation accounts for a substantial percentage of initial denials in surgical specialties. Payer automation increasingly relies on discrete data fields rather than NLP parsing of free-text notes. When exam findings exist only as narrative prose buried in a pasted paragraph, automated prior authorization queries return null results, triggering denials that require manual appeals costing staff time and delaying patient care.
Technical Reference: ICD-10 Documentation Standards for ACL Injuries
Accurate ICD-10 coding for anterior cruciate ligament injuries depends on documentation specificity that only discrete, structured capture can reliably support. The CMS ICD-10 coding guidelines require laterality, encounter type, and anatomic specificity for musculoskeletal injury codes. The following codes are relevant to the orthopedic scenario above:
Primary Codes
S83.511A — Sprain of anterior cruciate ligament of right knee, initial encounter
Requires documentation of: laterality (right), ligament (ACL), encounter type (initial)
Supporting documentation: mechanism of injury, physical exam findings (Lachman, anterior drawer, pivot shift), imaging correlation
Seventh character "A" designates initial encounter; subsequent visits require "D"; sequela requires "S"
Without discrete laterality capture, coders must interpret narrative text—introducing error risk and audit vulnerability
M25.561 — Pain in right knee
Supporting/secondary code when pain is documented as a distinct clinical concern
Requires laterality specification (right = 1, left = 2)
Commonly paired with S83.511A to capture the symptom driving the encounter alongside the structural diagnosis
How Scribing.io Ensures Maximum Code Specificity
Scribing.io's clinical NLP pipeline is trained to extract the discrete elements that ICD-10 codes require:
ICD-10 Documentation Elements: Discrete Capture vs. Narrative Dependency

| Data Element | Required for Code Assignment | Optimal Capture Method | Paste-Window Outcome | Scribing.io SDE Outcome |
|---|---|---|---|---|
| Laterality | Yes (S83.511A vs S83.512A) | Discrete field | Buried in narrative; requires coder interpretation | Mapped to laterality SDE; unambiguous |
| Ligament specificity | Yes (ACL vs PCL vs collateral) | Discrete field or structured exam | May be unclear if multiple structures discussed | Mapped to specific ligament exam SDE |
| Encounter type (initial/subsequent/sequela) | Yes (7th character: A, D, or S) | Encounter metadata | Often omitted from pasted text | Auto-populated from encounter context |
| Lachman grade | Supports medical necessity for surgical intervention | Discrete SDE (Knee Exam) | Free text; payer automation cannot parse | Discrete value: Grade 2 |
| Translation measurement (mm) | Supports medical necessity; quantifies instability | Discrete SDE (Knee Exam) | Free text; units may be ambiguous | Discrete value: 6 mm with unit specification |
| Mechanism of injury | Supports initial encounter designation | HPI structured field | Present but unstructured | Extracted and mapped to HPI SDE |
The critical insight: ICD-10 code assignment accuracy is only as good as the discrete data available to the coder. When a surgeon dictates "Lachman grade 2, 6 mm translation, right knee" and that information arrives in Epic as an unstructured text blob, the coder must manually interpret and extract each element. This introduces transcription error, slows coding throughput, and creates audit risk. When the same dictation flows through Scribing.io's NLP → SDE mapping pipeline, each element arrives as a discrete, queryable value—laterality in its field, grade in its field, measurement in its field. The coder (or automated coding engine) can assign S83.511A with confidence, and the payer's prior authorization logic can confirm medical necessity without human review.
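The point is easy to show in code: once laterality and encounter type exist as discrete values, the ICD-10-CM code assembles itself deterministically. This worked example uses the codes cited above; the input field names are illustrative.

```python
# Worked example: the S83.51- base, the laterality digit, and the 7th
# character all come straight from structured values.
LATERALITY_DIGIT = {"right": "1", "left": "2"}                  # S83.511- vs S83.512-
ENCOUNTER_CHAR = {"initial": "A", "subsequent": "D", "sequela": "S"}

def acl_sprain_code(laterality: str, encounter_type: str) -> str:
    """Assemble the ICD-10-CM code for an ACL sprain from discrete inputs."""
    return f"S83.51{LATERALITY_DIGIT[laterality]}{ENCOUNTER_CHAR[encounter_type]}"

print(acl_sprain_code("right", "initial"))   # S83.511A, matching the scenario above
```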
Step-by-Step: The Encounter-Scoped Launch Sequence
For CMIOs evaluating implementation complexity, the following details the exact technical sequence from Epic Hyperspace action to NoteWriter draft appearance:
Hyperspace Activity Launch: The clinician (or MA) clicks the Scribing.io activity button within the active encounter in Hyperspace. Epic initiates the SMART launch sequence, passing the launch parameter and iss (FHIR server URL) to Scribing.io's registered redirect URI.

OAuth2 Authorization: Scribing.io requests authorization with scopes launch, launch/patient, launch/encounter, patient/DocumentReference.write, and patient/Observation.write, plus the notes-write capability scope. Epic's authorization server validates the app's registration (via App Orchard or private listing) and returns an authorization code.

Token Exchange: Scribing.io exchanges the authorization code for an access token. The token response includes the patient FHIR ID and encounter FHIR ID—this encounter-scoping is what paste-window vendors never receive because they do not launch within encounter context.

Encounter Context Validation: The app confirms the encounter is open (status = "in-progress" or "arrived") and retrieves the encounter type to determine the appropriate NoteTypeID. An orthopedic follow-up maps to a different NoteTypeID than an ED visit or a telehealth encounter.
Audio Processing + NLP (concurrent): While the encounter is active, ambient audio is processed in near real-time. Clinical entities are extracted and validated against medical ontologies (SNOMED CT, LOINC).
SDE Mapping: Extracted entities are mapped to the organization's configured SmartData Elements. This mapping is established during implementation and stored in Scribing.io's configuration layer. Each SDE has a defined data type (numeric, coded value, free text) and validation rules.
Notes-Write API Execution: The assembled note payload—narrative sections + SDE values + NoteTypeID + encounter reference + author reference + draft status—is submitted via Epic's notes-write endpoint. The API validates NoteTypeID against the encounter type and returns success or a structured error.
NoteWriter Draft Confirmation: On success, the note appears in the clinician's NoteWriter queue for the active encounter. The clinician sees it immediately without navigating away from their current screen.
Total elapsed time from encounter close to NoteWriter draft availability: typically under 90 seconds for a standard office visit. The clinician never leaves Epic. No external window opens. No paste action occurs.
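For teams who want to see the mechanics, here is a minimal sketch of the token exchange in step 3 of the sequence above, assuming a standard SMART on FHIR token endpoint; the URLs and client ID are placeholders. The key detail is that, with launch/patient and launch/encounter granted, the token response itself carries the patient and encounter FHIR IDs.

```python
# Minimal sketch of the SMART on FHIR token exchange. Endpoint URL, client
# ID, and redirect URI are hypothetical placeholders.
import requests

TOKEN_URL = "https://ehr.example.org/oauth2/token"        # hypothetical
REDIRECT_URI = "https://app.example.com/smart/callback"   # hypothetical
CLIENT_ID = "scribe-app-client-id"                        # hypothetical

def exchange_code(auth_code: str) -> dict:
    """Trade the authorization code for an access token plus launch context."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "authorization_code",
            "code": auth_code,
            "redirect_uri": REDIRECT_URI,
            "client_id": CLIENT_ID,
        },
        timeout=30,
    )
    resp.raise_for_status()
    token = resp.json()
    return {
        "access_token": token["access_token"],
        "patient_id": token.get("patient"),      # launch/patient context
        "encounter_id": token.get("encounter"),  # launch/encounter context
    }
```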
All-Party Consent State Compliance Framework
The orthopedic scenario specifies an all-party consent state, which introduces additional workflow requirements that the AI scribe must handle programmatically. Thirteen states (including California, Illinois, and Florida) require all parties to consent to recording. The NIH's analysis of ambient clinical documentation ethics emphasizes that consent must be informed, documented, and revocable.
Scribing.io's consent management layer addresses this through:
Jurisdiction detection: Practice location triggers the appropriate consent workflow (one-party vs. all-party).
Pre-recording consent prompt: The MA or clinician is required to acknowledge consent capture before recording initiates. The system will not begin audio capture without this acknowledgment.
Patient-facing notification: Configurable signage language and verbal script recommendations are provided during implementation.
Consent metadata: A timestamped consent record is attached to the encounter, creating an auditable trail.
Mid-encounter opt-out: If a patient revokes consent during the visit, the clinician can pause recording immediately. Previously captured audio for that segment is flagged per organizational policy.
This consent infrastructure is not optional—it is a prerequisite for any ambient AI scribe operating in all-party consent jurisdictions. Vendors who treat consent as a "customer responsibility" shift legal risk to the practice without providing the tooling to manage it.
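A simplified sketch of the consent gate described above. The state list is partial and the field names are placeholders; the real jurisdiction table and acknowledgment workflow are configured per organization during implementation.

```python
# Illustrative consent gate: jurisdiction lookup plus a timestamped consent
# record. The state set is partial and for illustration only.
from datetime import datetime, timezone

ALL_PARTY_STATES = {"CA", "IL", "FL", "WA", "PA", "MD", "MA"}  # partial, illustrative

def consent_record(practice_state: str, acknowledged_by: str | None) -> dict:
    """Return a timestamped consent record, or block recording if consent is missing."""
    requires_all_party = practice_state.upper() in ALL_PARTY_STATES
    if requires_all_party and not acknowledged_by:
        raise PermissionError("All-party consent required; recording not started.")
    return {
        "jurisdiction": practice_state.upper(),
        "consent_model": "all-party" if requires_all_party else "one-party",
        "acknowledged_by": acknowledged_by,
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
    }
```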
CMIO Evaluation Criteria: Separating Real Integration from Marketing Claims
Based on the technical architecture described above, CMIOs should require vendors to demonstrate (not merely claim) the following during evaluation:
CMIO Vendor Evaluation: Critical Technical Questions

| Evaluation Criterion | What to Ask | Red Flag Response | Acceptable Response |
|---|---|---|---|
| Note delivery target | "Where does the generated note appear in Epic?" | "In a sidebar" / "In Chart Review" / "We generate a PDF" | "As an editable draft in NoteWriter for the active encounter" |
| Launch context | "What OAuth2 scopes does your app request?" | Cannot specify / does not include launch/encounter | "launch/patient, launch/encounter, notes-write capability" |
| NoteTypeID handling | "How do you determine the correct NoteTypeID?" | "We use a default" / "The user selects it" | "Mapped from encounter type during implementation; validated per call" |
| Discrete data capture | "Can payers query your output as structured data?" | "Our notes are very detailed" (narrative only) | "We populate SmartData Elements; discrete values are queryable" |
| Click count (verifiable) | "Show me the clinician workflow from dictation end to signed note" | Involves copy, paste, or navigation to a different Epic module | 2–3 clicks: review draft → cosign |
| Cosign routing | "Does the note auto-route for cosign?" | "The clinician signs it manually" | "Cosign routing is configurable per provider role and note type" |
| Consent state handling | "How do you handle all-party consent states?" | "That's the practice's responsibility" | "Jurisdiction-aware consent workflow with timestamped documentation" |
A vendor who cannot answer these questions with specificity has not built a true NoteWriter integration. They have built a transcription engine with a marketing website that says "Epic compatible."
Book a Technical Demo: Live NoteWriter Push in Epic Sandbox
The claims in this playbook are verifiable in 20 minutes. Scribing.io offers live demonstrations in an Epic sandbox environment where CMIOs and their technical teams can observe:
Encounter-scoped launch: Watch the OAuth2 flow pass the encounter FHIR ID to the application in real time.
Notes-write execution: See the API call fire and the NoteWriter draft appear in Hyperspace within seconds.
NoteTypeID mapping: Observe how the system selects the correct note type based on encounter context.
SmartData Element prefill: Verify that dictated clinical findings populate discrete SDE fields in the Physical Examination section.
Cosign routing: Confirm that the draft routes to the appropriate provider per configurable rules.
Click count verification: Count the clinician interactions from draft appearance to signed note. It will be 2–3.
Book a 20-minute technical demo to see a live Smart-on-FHIR NoteWriter push (encounter-scoped launch + notes write + NoteTypeID mapping + SmartData Element prefill with optional cosign routing) in an Epic sandbox—no copy-paste and 20+ clicks saved per note. Contact the Scribing.io implementation team at scribing.io to schedule.
No paste window. No Chart Review filing. No 20 extra clicks. The note lands where it belongs—in NoteWriter, as an editable draft, with discrete data that coders can code and payers can query. That is the integration standard CMIOs should demand in 2026.

