Recently, I've been working on an OCR system focused on reading digital medical prescriptions. The goal is simple: to allow healthcare professionals and pharmacies to automate the interpretation of medical prescriptions from images sent via API. All of this with security, scalability, and without the headache of credential management.
In this first article of the series, I'll show how I designed the architecture, the justification for using Azure Functions, and how I integrated with Blob Storage using Managed Identity. By the end, we'll have a functional endpoint for secure image upload. Shall we?
Why Azure Functions?
The choice of Azure Functions came naturally for several reasons:
- Serverless: I don't need to worry about infrastructure.
- Scalable: The system will handle many image uploads, so scaling on demand is essential.
- Integrated with the Azure ecosystem: especially Blob Storage and Managed Identity.
And since we're talking about an OCR system, where the main trigger will be the submission of an image for analysis, an HTTP Function fits perfectly.
About the project architecture
I organized the code structure in a clean and modular way, following a light DDD approach.
/ocr-function-app
├── application/
│ └── UploadImageService.ts
├── domain/
│ └── IImageStorage.ts
├── infrastructure/
│ └── AzureBlobStorage.ts
├── HttpAddToBlob/
│ └── index.ts
│ └── function.json
├── host.json
├── local.settings.json
└── package.json
The idea is for the Function to be just the entry point, delegating responsibilities to more specific layers.
⛏️ Setting Up the Azure Function Environment
Before we start coding, we need to ensure that our Azure Function environment is ready to deploy and run correctly with Managed Identity authentication and Blob Storage integration.
- Create your Azure Function in the portal or via CLI:
func init ocr-function-app --worker-runtime node --language typescript
- Create the HTTP trigger Function:
func new --name HttpAddToBlob --template "HTTP trigger"
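The generated HttpAddToBlob/function.json wires the HTTP trigger to the handler. A typical configuration looks like the sketch below (the authLevel and scriptFile path may differ in your project, so treat this as a reference, not the exact generated file):

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ],
  "scriptFile": "../dist/HttpAddToBlob/index.js"
}
```

Restricting methods to POST matches our use case: the Function only receives image uploads.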
📦 Packages Used
You will need to install the following packages in your TypeScript project with Azure Functions:
npm install @azure/storage-blob
npm install @azure/identity
These packages will be used to:
- Create the HTTP trigger Function (@azure/functions, already included by the TypeScript template from func init)
- Interact with Blob Storage (@azure/storage-blob)
- Authenticate with Managed Identity (@azure/identity)
🔗 Connecting Your Azure Function to Blob Storage Securely
At this point, we've set up the base of our Function App and have the code ready to receive images. Now comes a crucial part: ensuring the application can access Azure Blob Storage in a secure and scalable manner.
The idea here is to avoid using sensitive connection strings in your code or environment variables, opting for something much more secure: Managed Identity + Service Connector.
✅ Why Use Service Connector?
When we use DefaultAzureCredential in the code, Azure already knows we want to authenticate with a Managed Identity. But for this to work in practice, we need to ensure that identity has permission to access the Blob.
The Service Connector acts as a facilitator: it creates the connection between resources securely and without hassle, and also takes care of network configurations and permissions for you.
⚙️ Creating the Connection to Blob
- In the Azure portal, go to your Function App.
- In the side menu, click on Service Connector > + Add.
- Fill in the options as follows:
  - Target Resource: select your Azure Storage account.
  - Connection Name: something like BlobConnection_OcrApp.
  - Authentication Type: select User Assigned Managed Identity (this is what makes everything more secure).
- In the network step, you can leave the default settings. Azure will ensure your Function App can communicate with the Storage account.
- Click Next: Review + Create and then Create.
This process takes a few minutes, but once it's done, your Function will be ready to access the Blob with all the security that the cloud can offer.
🔐 Permissions: Granting Access to the Container with Service Connector
After creating the connection, the Service Connector ensures that the Function's managed identity has permission on the Blob container.
How to validate this?
- Access the storage account in the Azure portal.
- Go to Access Control (IAM).
- Click on Role Assignments.
- You will see the managed identity (for example, ocr-umi) listed with a role such as Storage Blob Data Contributor.
Done! Now your Function App can securely upload to the Blob without needing to store any credentials in the code.
📦 In the Code, the Implementation is Simple!
In the code, you will configure DefaultAzureCredential. It will automatically use the identity set up by the Service Connector; passing the client ID tells it which user-assigned identity to pick:
const credential = new DefaultAzureCredential({
  managedIdentityClientId, // client ID of the user-assigned managed identity
});
With this, your backend is much more secure—and ready to scale without headaches.
🔐 Environment Variables
Environment variables are essential for securely configuring access to Blob Storage using managed identity:
local.settings.json (for local development only)
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"FUNCTIONS_WORKER_RUNTIME": "node",
"AZURE_STORAGEBLOB_RESOURCEENDPOINT": "https://<accountname>.blob.core.windows.net",
"AZURE_STORAGEBLOB_CONTAINERNAME": "ocr-container",
"AZURE_STORAGEBLOB_CLIENTID": "<client-id-of-managed-identity>"
}
}
The AZURE_STORAGEBLOB_CLIENTID should contain the Client ID of the User Assigned Managed Identity created in Azure, which the Function will use to access the Blob.
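To fail fast when one of these settings is missing, the Function can validate its configuration at startup. A minimal sketch, assuming the variable names from local.settings.json above (the helper names are my own, not part of the article's code):

```typescript
// Reads a required setting, throwing early so misconfiguration
// surfaces at startup rather than in the middle of a request.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Collects all Blob-related settings in one place.
function loadBlobConfig() {
  return {
    endpoint: requireEnv("AZURE_STORAGEBLOB_RESOURCEENDPOINT"),
    containerName: requireEnv("AZURE_STORAGEBLOB_CONTAINERNAME"),
    clientId: requireEnv("AZURE_STORAGEBLOB_CLIENTID"),
  };
}
```

Calling loadBlobConfig() once at module load gives a clear error message in the Function logs instead of a confusing SDK failure later.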
Secure Upload Using Managed Identity
A critical point here is the image upload to Azure Blob Storage. Instead of using a hardcoded connection string (which would be a security risk), I opted to use Managed Identity as mentioned earlier.
The flow is as follows:
- The image arrives via HTTP.
- The Function authenticates with Blob Storage via Managed Identity.
- The image is securely saved in the container.
- The image URL is returned to the API caller.
The magic begins in the application service:
import { v4 as uuidv4 } from "uuid"; // requires the uuid package
import { IImageStorage } from "../domain/IImageStorage";

export class UploadImageService {
  constructor(private readonly imageStorage: IImageStorage) {}

  async handleUpload(buffer: Buffer): Promise<{ url: string; fileName: string }> {
    // Generate a unique name so concurrent uploads never collide
    const fileName = `${uuidv4()}.png`;
    const url = await this.imageStorage.uploadImage(buffer, fileName);
    return { url, fileName };
  }
}
Here, the UploadImageService follows the principle of dependency injection, working against the IImageStorage interface, which facilitates testing and decouples business logic from the actual storage implementation.
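The IImageStorage contract itself is tiny. For reference, it can look like the sketch below, paired with an in-memory fake that makes UploadImageService unit-testable without an Azure account (the fake is illustrative, not part of the article's repository):

```typescript
// domain/IImageStorage.ts — the storage contract the service depends on
interface IImageStorage {
  uploadImage(buffer: Buffer, fileName: string): Promise<string>;
}

// An in-memory fake for unit tests: no network, no credentials.
class InMemoryImageStorage implements IImageStorage {
  readonly files = new Map<string, Buffer>();

  async uploadImage(buffer: Buffer, fileName: string): Promise<string> {
    this.files.set(fileName, buffer);
    return `memory://${fileName}`;
  }
}
```

Swapping AzureBlobStorage for InMemoryImageStorage in tests exercises the whole upload flow locally.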
Implementing Storage with Azure Blob
The concrete implementation of IImageStorage is the AzureBlobStorage class, which encapsulates all interaction with the Azure SDK.
import { BlobServiceClient } from "@azure/storage-blob";
import { DefaultAzureCredential } from "@azure/identity";
import { IImageStorage } from "../domain/IImageStorage";

export class AzureBlobStorage implements IImageStorage {
  private readonly blobServiceClient: BlobServiceClient;

  constructor(
    url: string,
    private readonly containerName: string,
    credential: DefaultAzureCredential
  ) {
    this.blobServiceClient = new BlobServiceClient(url, credential);
  }

  async uploadImage(buffer: Buffer, fileName: string): Promise<string> {
    const containerClient = this.blobServiceClient.getContainerClient(this.containerName);
    const blobClient = containerClient.getBlockBlobClient(fileName);
    // Upload the raw bytes; the content length is required for block blobs
    await blobClient.upload(buffer, buffer.length);
    return blobClient.url;
  }
}
Notice that the BlobServiceClient is instantiated with DefaultAzureCredential, which automatically uses the Function App's Managed Identity in Azure, without exposing secrets in the code.
The Function That Receives the Image
At the API layer, we have an Azure Function that serves as the entry point for the application:
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import { BlobServiceClient } from "@azure/storage-blob";
import { DefaultAzureCredential } from "@azure/identity";
import { AzureBlobStorage } from "../infrastructure/AzureBlobStorage";
import { UploadImageService } from "../application/UploadImageService";
// Loading environment variables (names match local.settings.json above)
const blobUrl = process.env.AZURE_STORAGEBLOB_RESOURCEENDPOINT!;
const containerName = process.env.AZURE_STORAGEBLOB_CONTAINERNAME!;
const managedIdentityClientId = process.env.AZURE_STORAGEBLOB_CLIENTID!;
const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
  // Reject anything that is not an image payload
  if (!req.body || !req.headers["content-type"]?.startsWith("image/")) {
    context.res = { status: 400, body: "Invalid or missing image" };
    return;
  }

  const buffer = Buffer.isBuffer(req.body) ? req.body : Buffer.from(req.body);

  // DefaultAzureCredential picks up the user-assigned identity via its client ID
  const credential = new DefaultAzureCredential({
    managedIdentityClientId,
  });

  const blobStorage = new AzureBlobStorage(blobUrl, containerName, credential);
  const uploadService = new UploadImageService(blobStorage);

  const { url } = await uploadService.handleUpload(buffer);

  context.res = {
    status: 200,
    body: {
      message: "Image successfully stored",
      url,
    },
  };
};

export default httpTrigger;
Once deployed, you can exercise the function on Azure with curl (the host, function key, and file name below are placeholders for your own values):
curl -X POST "https://<your-function-app>.azurewebsites.net/api/HttpAddToBlob?code=<function-key>" -H "Content-Type: image/png" --data-binary "@prescription.png"
With just a few blocks of code, we’re able to build a complete, clean, and secure upload pipeline.
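The content-type guard at the top of the Function is also easy to factor out and test in isolation. A minimal sketch (the helper name is my own, not from the article's code):

```typescript
// Returns true only when the request carries a non-empty image payload.
// Mirrors the guard used at the start of the HTTP trigger.
function isImageRequest(
  body: unknown,
  contentType: string | undefined
): boolean {
  return Boolean(body) && Boolean(contentType?.startsWith("image/"));
}
```

Keeping the guard as a pure function means invalid-request handling can be verified without spinning up the Functions runtime.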
Security Considerations
This approach offers several advantages:
✅ No connection strings in the code
✅ Use of Managed Identity to authenticate with Blob Storage
✅ Clear code organization, separating the Function from business logic
🔄 Next Steps
With the image successfully stored, the next step will be:
Integrating with the Azure SQL database, where we’ll model and store the extracted data from the medical prescription before, during, and after OCR processing.
We’ll register data such as:
- Image ID
- Upload date
- Text extracted via OCR
- Processing status
All of this while maintaining security and scalability.
🚀 Next step: Part 2 - Persisting Data in Azure SQL with Best Practices