If your compliance evidence lives in a shared drive full of screenshots, a spreadsheet that someone updates manually each quarter, and an email chain that references the “current version” of a policy document that has been renamed three times — this post is for you.
The problem is not unique to small organisations. I have seen it in regulated financial institutions. The root cause is almost always the same: compliance evidence collection was bolted onto existing processes rather than built into them. Someone’s job description includes “collect evidence for the audit” as a quarterly task, and so it gets done quarterly — inconsistently, under time pressure, and without a reliable audit trail.
Compliance-as-Code is the architectural shift that fixes this.
The Problem with Manual Evidence
Let’s be specific about what manual compliance evidence actually looks like in practice.
It is the GRC analyst who takes a screenshot of the MFA enforcement policy status in the Microsoft 365 admin panel every quarter. The screenshot is saved to a folder with a date in the filename, which everyone agreed was the standard but which has drifted across three formats over the past year. The folder is in SharePoint, which has had two reorganisations. The analyst who originally set up the folder has left.
When the auditor asks for 12 months of MFA enforcement evidence, someone spends three days tracking down every quarterly screenshot, confirming the dates, verifying that the screenshots are genuine, and worrying that there is a gap in Q2 because that was during the team transition.
This is expensive, unreliable, and not particularly auditable. An auditor who wants to be rigorous has no way to verify that a screenshot taken in April was not taken last week.
What Compliance-as-Code Means
Compliance-as-Code is the practice of automating compliance evidence collection using code and APIs, storing that evidence in version-controlled systems, and generating compliance reports programmatically rather than manually.
In concrete terms, it means:
- Instead of a screenshot, an API call pulls the current state of a control and saves the structured JSON response to a Git repository with a timestamp
- The Git commit hash provides cryptographic proof that the evidence has not been modified since collection
- The pipeline runs on a schedule — daily, weekly, or monthly — without anyone remembering to trigger it
- A report generator assembles the evidence into a structured document that maps findings to control requirements
The evidence is continuous. The collection is consistent. The audit trail is built in.
A Practical Example: Microsoft 365 MFA Status via Graph API
Here is a real example of pulling MFA registration status for all users in a Microsoft 365 tenant using the Microsoft Graph API.
```python
import json
import os
from datetime import datetime, timezone
from pathlib import Path

import requests


def get_access_token(tenant_id: str, client_id: str, client_secret: str) -> str:
    """Obtain an access token using the client credentials flow."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    payload = {
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://graph.microsoft.com/.default",
        "grant_type": "client_credentials",
    }
    response = requests.post(url, data=payload, timeout=30)
    response.raise_for_status()
    return response.json()["access_token"]


def get_mfa_registration_status(access_token: str) -> list[dict]:
    """Fetch MFA registration details for all users.

    Uses the v1.0 userRegistrationDetails report, which requires the
    AuditLog.Read.All application permission.
    """
    url = "https://graph.microsoft.com/v1.0/reports/authenticationMethods/userRegistrationDetails"
    headers = {"Authorization": f"Bearer {access_token}"}
    users = []
    while url:
        response = requests.get(url, headers=headers, timeout=30)
        response.raise_for_status()
        data = response.json()
        users.extend(data.get("value", []))
        url = data.get("@odata.nextLink")  # Handle pagination; None on the last page
    return users


def save_evidence(data: list[dict], output_dir: Path) -> Path:
    """Save evidence with a UTC timestamp to the evidence directory."""
    output_dir.mkdir(parents=True, exist_ok=True)
    timestamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    filename = output_dir / f"mfa_status_{timestamp}.json"
    evidence = {
        "collected_at": datetime.now(timezone.utc).isoformat(),
        "control": "MFA Enforcement",
        "framework_reference": "CIS M365 Benchmark v3.0 - Control 1.1.1",
        "user_count": len(data),
        "mfa_registered": sum(1 for u in data if u.get("isMfaRegistered")),
        "mfa_not_registered": sum(1 for u in data if not u.get("isMfaRegistered")),
        "detail": data,
    }
    with open(filename, "w") as f:
        json.dump(evidence, f, indent=2)
    return filename


# Usage
if __name__ == "__main__":
    token = get_access_token(
        tenant_id=os.environ["AZURE_TENANT_ID"],
        client_id=os.environ["AZURE_CLIENT_ID"],
        client_secret=os.environ["AZURE_CLIENT_SECRET"],
    )
    users = get_mfa_registration_status(token)
    evidence_file = save_evidence(users, Path("./evidence/mfa"))
    not_registered = [u for u in users if not u.get("isMfaRegistered")]

    print(f"Evidence saved: {evidence_file}")
    print(f"Total users: {len(users)}")
    print(f"MFA registered: {len(users) - len(not_registered)}")
    print(f"MFA not registered: {len(not_registered)}")
    if not_registered:
        print("\nUsers without MFA registered:")
        for user in not_registered:
            print(f"  - {user.get('userPrincipalName')}")
```
This script, run daily via a cron job or GitHub Actions workflow, produces a dated evidence file in a Git repository. Every run is a new commit. The full history is preserved and auditable. The report at the end of the quarter is generated from the collected evidence, not assembled manually.
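That last step can stay just as simple as the collector. As a sketch of what "the report writes itself" might look like, here is a minimal generator that walks the `mfa_status_*.json` files the script above produces and emits a Markdown summary. The function name `generate_quarterly_report` is illustrative, not part of the script above.

```python
import json
from pathlib import Path


def generate_quarterly_report(evidence_dir: Path) -> str:
    """Assemble collected evidence files into a Markdown summary.

    Assumes the mfa_status_<timestamp>.json naming used by the
    collector, so a lexical sort is also a chronological sort.
    """
    lines = ["# MFA Enforcement - Quarterly Evidence Summary", ""]
    for path in sorted(evidence_dir.glob("mfa_status_*.json")):
        record = json.loads(path.read_text())
        lines.append(
            f"- {record['collected_at']}: "
            f"{record['mfa_registered']}/{record['user_count']} users "
            f"MFA-registered ({path.name})"
        )
    return "\n".join(lines)


if __name__ == "__main__":
    print(generate_quarterly_report(Path("./evidence/mfa")))
```

Because the evidence is structured JSON rather than screenshots, the same files can feed as many report formats as you need without touching the collector.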
Benefits Beyond Audit Efficiency
The operational benefits of this approach extend beyond making auditors happy.
Continuous visibility. When evidence is collected daily rather than quarterly, you see drift immediately. If MFA registration drops because new accounts were added without MFA setup, you know within 24 hours rather than at the next quarterly review.
Consistency. The API returns the same structured data every time. There is no variation based on who ran the check, what window was open when the screenshot was taken, or whether the analyst remembered to include all tenants.
Scalability. Adding a new control to the pipeline is a matter of writing another function. You are not adding to someone’s manual workload — you are adding to a system that runs automatically.
Defensibility. Git commit hashes provide cryptographic proof of when evidence was collected and that it has not been modified. This is far more defensible under scrutiny than a folder of screenshots.
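To see why this holds, it helps to know that Git identifies file content by hashing it: a file is stored as a "blob" object whose SHA-1 is computed over a small header plus the exact bytes of the file, and that blob hash feeds into the commit hash. The snippet below reproduces Git's blob hashing (this is Git's documented object format, shown here purely to illustrate the tamper-evidence argument):

```python
import hashlib


def git_blob_sha1(content: bytes) -> str:
    """Compute the SHA-1 Git assigns to file content (a 'blob' object).

    Git hashes the header "blob <length>\\0" followed by the raw bytes,
    so changing a single byte of an evidence file changes this hash,
    which in turn changes every commit hash that references it.
    """
    header = f"blob {len(content)}\0".encode()
    return hashlib.sha1(header + content).hexdigest()


original = b'{"mfa_registered": 42}'
tampered = b'{"mfa_registered": 45}'
assert git_blob_sha1(original) != git_blob_sha1(tampered)
```

An auditor (or you) can therefore check that an evidence file matches the hash recorded in the commit history at collection time, which is exactly what a folder of screenshots cannot offer.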
How Tanzanian SMEs Can Start Small
You do not need to automate everything at once. The right starting point is the control that causes the most pain in your current evidence process — the thing that takes the most time or that you worry about most at audit time.
For most SMEs on Microsoft 365, that is MFA status. The script above is a working starting point. With a few environment variables and a scheduled task, you have continuous, auditable evidence collection for one of the most important controls in your environment.
From there, the pipeline grows: external sharing settings, audit log exports, guest account lists, admin role assignments. Each one is another API call, another evidence file, another section of the quarterly report that writes itself.
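As one illustration of how little each addition costs, a guest-account collector follows the same token-plus-pagination pattern as the MFA script. The function names here are hypothetical; note that Microsoft Graph treats filtering on `userType` as an advanced query, which is why the request carries a `ConsistencyLevel: eventual` header and `$count=true`.

```python
import requests


def get_guest_accounts(access_token: str) -> list[dict]:
    """Fetch all guest accounts in the tenant via Microsoft Graph.

    Filtering on userType is an advanced query, so Graph requires the
    ConsistencyLevel header paired with $count=true.
    """
    url = (
        "https://graph.microsoft.com/v1.0/users"
        "?$filter=userType eq 'Guest'"
        "&$select=id,displayName,userPrincipalName,createdDateTime"
        "&$count=true"
    )
    headers = {
        "Authorization": f"Bearer {access_token}",
        "ConsistencyLevel": "eventual",
    }
    guests = []
    while url:
        response = requests.get(url, headers=headers, timeout=30)
        response.raise_for_status()
        data = response.json()
        guests.extend(data.get("value", []))
        url = data.get("@odata.nextLink")  # None on the last page
    return guests


def summarise_guests(guests: list[dict]) -> dict:
    """Reduce the raw list to the summary fields stored as evidence."""
    return {
        "control": "Guest Account Review",
        "guest_count": len(guests),
        "guests": [g.get("userPrincipalName") for g in guests],
    }
```

Feed the summary into the same `save_evidence`-style writer and the same Git repository, and the new control inherits the whole audit trail for free.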
The goal is not to eliminate human judgement — it is to eliminate the manual, error-prone, time-consuming parts of evidence collection so that human attention can focus on analysis, improvement, and response.
A Note on Tooling
The example above uses Python and the Microsoft Graph API. The same approach works with:
- Azure Resource Manager API for Azure infrastructure compliance evidence
- Cloudflare API for DNS and security configuration evidence
- Resend or a similar email API for DMARC report aggregation
- Virtually any modern SaaS tool with an API — most compliance-relevant platforms have one
The code does not need to be sophisticated. It needs to be reliable, documented, and version-controlled. A 50-line Python script that runs daily and commits JSON to a Git repository is worth more than an enterprise GRC platform that nobody has time to maintain.
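Even the "commits JSON to a Git repository" step can be a few lines of that same script. A sketch, assuming `git` is on the PATH and the evidence directory is already an initialised repository with `user.name` and `user.email` configured (the function name `commit_evidence` is illustrative):

```python
import subprocess
from pathlib import Path


def commit_evidence(repo_dir: Path, evidence_file: Path, control: str) -> str:
    """Stage and commit a newly collected evidence file.

    Returns the new commit hash, which serves as the tamper-evident
    reference for this collection run.
    """
    rel = evidence_file.relative_to(repo_dir)
    subprocess.run(["git", "-C", str(repo_dir), "add", str(rel)], check=True)
    subprocess.run(
        ["git", "-C", str(repo_dir), "commit", "-m", f"evidence: {control} ({rel.name})"],
        check=True,
    )
    result = subprocess.run(
        ["git", "-C", str(repo_dir), "rev-parse", "HEAD"],
        check=True, capture_output=True, text=True,
    )
    return result.stdout.strip()
```

Called at the end of each collection run, this gives every evidence file a commit hash and timestamp without any manual step.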
If you are interested in building compliance automation pipelines for your organisation’s specific framework requirements, our Compliance-as-Code Engineering service covers the full engagement — from assessment through to working, documented pipelines and a handover session.