Letting non-technical staff reconfigure systems holding sensitive personal data with AI-generated changes creates a compliance minefield for organizations subject to GDPR and CCPA.
Dyna Software’s AI ServiceNow config tool puts personal data at risk as developer checks disappear
Dyna Software, a Calgary-based ServiceNow Elite Build Partner, launched Platform Copilot in open beta on May 5, 2026, with full commercial availability targeted for July. The agentic AI tool is designed to let business analysts and process consultants configure ServiceNow instances using plain language descriptions or images of legacy forms, handling roughly 80% of the routine enhancement work that typically requires developer intervention. The tool connects directly to a customer’s ServiceNow development instance, reads existing schemas and configuration details, and generates wireframes and production-ready configs that are validated against the specific environment’s parameters.

Platform Copilot is built on Dyna’s existing Guardrails DevOps toolset, which enforces ServiceNow technical best practices to prevent upgrade conflicts and technical debt. Unlike general-purpose AI coding tools such as Anthropic’s Claude or OpenAI’s Codex, which produce generic ServiceNow configs unless developers manually supply instance-specific details, Platform Copilot pulls environment parameters automatically to avoid conflicts. Dyna CEO Ron Browning demonstrated the tool at ServiceNow’s Knowledge 2026 event in Las Vegas, arguing that existing AI tools focus on enabling developers rather than business users, creating a bottleneck for routine configuration work.
“The goal is a situation where a business person fills in a form with their requirements, hits send, and the configuration is built and ready to deploy without technical folks involved,” Browning said. He acknowledged that complex application builds requiring custom coding or external integrations still need developer-led work, but said routine tasks like catalog items, workflows, and forms are prime targets for the AI tool. Early use cases include an Australian partner that migrated more than 200 catalog items from a legacy system in minutes rather than months, and government agencies digitizing PDF forms into ServiceNow portals, a process that previously took up to two years.
Legal basis: Privacy regulations apply to all ServiceNow configuration changes
ServiceNow instances are widely used by organizations to process personal data, including employee HR records, customer support tickets, access requests, and sensitive health or financial information. Under the GDPR, organizations must implement data protection by design and default (Article 25), meaning all system configurations must minimize data collection, restrict access to personal data, and ensure appropriate security measures from the outset. The CCPA imposes similar requirements, mandating that personal data be collected only for disclosed purposes (Section 1798.100) and protected with reasonable security measures (Section 1798.150).
Every configuration change made to a ServiceNow instance, from adding a new form field to adjusting role-based access controls, directly impacts how personal data is handled. Previously, these changes were made by developers who are typically trained in organizational data governance policies and privacy regulations. Platform Copilot shifts this responsibility to business users who may not understand these legal obligations, creating a gap in compliance oversight.
Impact on users and companies: Fines, breaches, and liability risks
Affected parties include the organizations using Platform Copilot, the data subjects whose personal data is stored in ServiceNow instances, and Dyna Software itself. For organizations, AI-generated misconfigurations could lead to serious regulatory violations. For example, if a business user asks Platform Copilot to create a new employee onboarding form that collects unnecessary personal data, such as marital status or health information not required for the role, that violates the GDPR’s data minimization principle (Article 5(1)(c)). If the tool misconfigures access controls, allowing unauthorized staff to view sensitive personal data, that constitutes a personal data breach, triggering GDPR Article 33’s requirement to notify regulators within 72 hours and Article 34’s requirement to inform affected data subjects without undue delay.
Fines for GDPR violations can reach up to 4% of an organization’s global annual revenue, or €20 million, whichever is higher. CCPA violations carry fines of up to $7,500 per intentional violation and $2,500 per unintentional violation. For the Australian partner that migrated 200 catalog items using Platform Copilot, a failure to include proper consent checkboxes or data retention rules in the AI-generated configs could lead to widespread non-compliance across all those items, multiplying potential fines. Government agencies using the tool to digitize citizen-facing forms are subject to additional sector-specific regulations, such as HIPAA for health data or FERPA for student records, which carry their own penalties for non-compliance.
Data subjects, including employees and customers, face direct harm if their personal data is exposed due to misconfigurations. Exposed data could lead to identity theft, fraud, or unauthorized surveillance, violating their fundamental right to data protection under Article 8 of the EU Charter of Fundamental Rights and similar provisions in other laws. Dyna Software could face liability if the tool’s outputs consistently fail to meet privacy requirements, especially since the company markets Platform Copilot as compliant with ServiceNow best practices without explicitly addressing data protection laws.
What changes: Compliance processes must adapt to AI-led configurations
Organizations adopting Platform Copilot will need to overhaul their data governance processes to account for non-developer-led changes. This includes mandatory privacy training for all business users accessing the tool, covering basic requirements like data minimization, purpose limitation, and access control principles. All AI-generated configurations must be audited by a privacy officer or trained developer before deployment to ensure compliance with GDPR, CCPA, and internal data policies. Organizations should also maintain detailed logs of all changes made via Platform Copilot to provide audit trails for regulators during compliance checks.
Dyna Software may need to add privacy-specific safeguards to Platform Copilot to address these risks. Currently, the tool’s Guardrails foundation only enforces ServiceNow technical best practices, not privacy regulations. Adding automated checks for data minimization, consent mechanisms, and data retention rule enforcement would help reduce compliance risks for customers. The company could also partner with privacy compliance firms to certify that Platform Copilot’s outputs meet GDPR and CCPA requirements.
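An automated data-minimization check of the kind described above could be as simple as screening generated form fields against a deny-list of sensitive categories before deployment. The following is a minimal sketch under stated assumptions: the `SENSITIVE_PATTERNS` list, the config shape, and the field names are illustrative inventions, not Dyna’s implementation.

```python
import re

# Hypothetical deny-list of GDPR special-category / high-risk field patterns.
SENSITIVE_PATTERNS = [
    r"health", r"medical", r"marital", r"religio", r"ethnic",
    r"biometric", r"sexual", r"union", r"ssn|social_security",
]


def flag_sensitive_fields(form_config: dict) -> list[str]:
    """Return names of generated form fields matching a sensitive pattern,
    which should be escalated to a privacy officer before deployment."""
    flagged = []
    for f in form_config.get("fields", []):
        name = f.get("name", "").lower()
        if any(re.search(p, name) for p in SENSITIVE_PATTERNS):
            flagged.append(f["name"])
    return flagged


# Example: an AI-generated onboarding form that collects more than the role requires.
onboarding = {
    "table": "hr_onboarding",
    "fields": [
        {"name": "full_name"},
        {"name": "start_date"},
        {"name": "marital_status"},    # unnecessary -> should be flagged
        {"name": "health_conditions"}, # unnecessary -> should be flagged
    ],
}
```

A pattern match like this cannot judge whether a flagged field is actually justified for the stated purpose; it only forces a human privacy review, which is precisely the gate the article argues is missing when business users deploy AI-generated configs directly.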
Regulators in the EU and California are likely to issue guidance on the use of AI tools for configuring systems that process personal data. This guidance will likely clarify that organizations remain fully liable for all system changes, even if generated by third-party AI tools, and may require mandatory audits of AI-generated configs for high-risk data processing systems. For data subjects, the shift highlights the need for greater transparency around how their personal data is handled in enterprise systems like ServiceNow, including new rights to request audits of configurations that affect their data.
Platform Copilot’s usage-based pricing model, with a $100 minimum credit purchase and no subscription commitment, lowers the barrier to entry for small and mid-sized organizations that may have even fewer resources to conduct compliance audits. This increases the risk of widespread non-compliance, as these organizations may not have dedicated privacy teams to oversee AI-generated changes.
As Browning noted, developers will not disappear entirely, but the routine, high-volume work that makes up 80% of ServiceNow backlogs is now being handled by non-technical staff. Without proper safeguards, this shift will lead to more privacy breaches, higher fines, and greater harm to data subjects whose personal data is at stake.
