
Kenya’s Child Data Protection Framework: How the Draft Compares Internationally and What Still Needs Work

By Jermaine Magethe

In May 2025, the Office of the Data Protection Commissioner (ODPC) released a Draft Guidance Note on Processing Children’s Data, a highly anticipated reckoning with how institutions, platforms, and public systems handle children’s personal data. Only weeks earlier, the ODPC had released the Draft Data Protection (Conduct of Compliance Audit) Regulations, 2024, outlining the machinery through which audits of data controllers and processors will be carried out across the country.

Together, these two draft documents suggest that Kenya is at last building the institutional structure required to bring the Data Protection Act (2019) into operation for vulnerable groups such as children. Compared with the regulatory frameworks of the UK, the EU, or even South Africa, however, Kenya’s system remains structurally reactive and fragmented, particularly in child protection.

The intentions exist. What is still missing is integration, enforcement specificity, and proactive protection.

Kenya’s draft guidance on children’s data takes a strong rights-based approach. It properly defines a child as anyone under 18 years, places greatest weight on the best-interests principle, demands parental consent (or child assent, depending on maturity), and pushes back against profiling, data monetization, and exploitative digital design. It also advances child-friendly communication and data minimization.

The draft compliance audit regulations, in turn, provide a welcome accountability framework. They empower the ODPC to initiate audits based on complaints, risk, or serious breaches. They allow private bodies to undergo audits voluntarily. And they set out rules for accrediting independent auditors, reporting timelines, and follow-up processes.

But this is where things go wrong: the two frameworks do not talk to each other.

There is no requirement, for instance, that organizations designing services for children, such as EdTech platforms, health apps, or youth NGOs, submit to periodic audits. There is no requirement that data controllers conduct risk assessments of child-facing features or services. And although both the guidance and Kenya’s Children Act explicitly invoke the best interests of the child, there is no functional detail on how this principle would shape audit triggers, design, or outcomes.

In other words, Kenya has written down a code of ethics. But it has not yet built the machinery to make it work, especially not in the settings where children’s data are most at risk.

The United Kingdom offers a model Kenya could emulate. Since 2021, the UK has applied its Age-Appropriate Design Code (also called the Children’s Code), which sets 15 enforceable standards for digital services likely to be used by children. These standards go beyond consent: they require privacy by default, switch off profiling and geolocation unless justified, and prohibit manipulative design methods known as “dark patterns.”

What distinguishes the UK, however, is its enforcement regime. While data protection falls to the Information Commissioner’s Office (ICO), the Online Safety Act (2023) entrusted the communications regulator, Ofcom, with regulating platform safety more broadly. Ofcom can now compel platforms to conduct safety risk assessments, impose age-assurance mechanisms, and fine them for non-compliance.

In other words, the UK does not simply issue codes of conduct. It makes platforms prove they are complying and gives regulators the teeth to probe.

Kenya’s compliance audit regulations are a good fit in form but not yet in function. They allow audits but do not require them for child-directed services. They specify procedures for determining compliance but not which child-specific requirements should be assessed. And they create no link between regulatory supervision and platform architecture, algorithmic risk, or monetization strategies.

The EU’s General Data Protection Regulation (GDPR) is widely regarded as the gold standard of personal data regulation. For children, the GDPR enshrines the rights to access, rectify, erase, and object to the processing of personal data. It requires parental consent for children under 16 (although member states can lower this threshold to 13) and regards children’s data as warranting “special protection.”

Crucially, the GDPR also imposes a proactive obligation: any organization undertaking high-risk processing, such as profiling, systematic monitoring, or large-scale surveillance, must conduct a Data Protection Impact Assessment (DPIA). This tool requires organizations to flag risks to children and propose mitigations before harm is done.

Kenya has no such requirement. The audit regulations contain a strong procedural framework but lack upstream risk-detection mechanisms. No DPIA requirement. No automatic audit threshold for child-facing services. No national baseline for high-risk uses of children’s data. This makes the system reactive rather than preventive.

South Africa’s Protection of Personal Information Act (POPIA) mirrors much of Kenya’s Data Protection Act but goes further where children are concerned. It expressly prohibits processing children’s personal data unless one of three conditions is met: a legal obligation, the consent of a guardian, or manifest compatibility with the best interests of the child.

Importantly, POPIA ties this protection to enforcement. The Information Regulator can initiate investigations, issue enforcement notices, and hold organizations handling children’s data accountable. While Kenya’s audit regulations do leave room for the ODPC to audit based on complaints, public interest, or contravention, they do not treat children’s data as a risk category in its own right.

That gap matters. In today’s digital world, where children’s information can be quietly harvested through app permissions, cookies, or third-party plugins, enforcement must be risk-based, not incident-based.

Loopholes Kenya Needs to Plug

Kenya’s audit regulations and children’s data guidelines are strong in intent but weak in bite. To meet international best practice, several areas need tightening:

  1. Risk-Based Auditing: The processing of children’s data should be classified as inherently high-risk, triggering mandatory audits of child-facing platforms, institutions, and products.
  2. Data Protection Impact Assessments (DPIAs): These should be mandatory for all high-risk processing of children’s data, particularly where profiling, AI, or automated decision-making is involved.
  3. Cross-Agency Coordination: The ODPC and the Communications Authority of Kenya (CAK) should develop joint audit guidelines and regulatory roadmaps, especially for digital media, where content, data, and commerce converge.
  4. Child-Centered Redress: Audit findings concerning children should feed into redress channels that children and caregivers can understand and use, especially where data rights have been violated.
  5. Open Audit Reporting: While Kenya’s audit regulations include public reporting provisions, these must explicitly cover child-related findings and, where appropriate, present them in child-friendly terms.

Kenya has made unmistakable, commendable strides. The children’s data guidance is ethical and protective. The audit rules are comprehensive and administratively careful. But without mandatory links between the two, children remain under-protected in practice.

Protecting children online requires more than good intentions. It requires a system in which institutions and platforms are held responsible not just when things fail, but beforehand, so that harm never occurs.

Kenya does not need to reinvent the wheel. It needs to connect the wheels it already has.
