Understanding BYOAI: The emerging threat to your company

Enrique Dans

Months ago, I warned about “shadow AI”: employees moving faster than their companies, using AI without permission or training, while managers pretended not to notice. The right response was never prohibition but education and better governance.

That was only the first signal of something bigger: BYOA or BYOAI, “bring your own algorithm” or “bring your own AI.”

Now the trend is visible everywhere: workers are embedding their own agents into daily workflows, while companies scramble to bolt on controls after the fact.

The comparison with the old BYOD is misleading — this is not about carrying a device, but about bringing in a cognitive layer that decides, infers, and learns alongside us. And recent evidence makes this gap even harder to ignore.

The data backs it up: Microsoft’s Work Trend Index already noted in 2024 that three out of four employees were using AI, and that 78% of them were “bringing it from home,” without waiting for corporate tools.

This isn’t marginal: it’s the new normal in an overworked environment where AI becomes a cognitive shortcut. The 2025 report goes further, warning that today’s workload “pushes the limits of the human” and that the real frontier organisations will be those that adopt human–agent collaboration as their default architecture. Governance, meanwhile, is still lagging behind.

Even so, the soothing corporate narrative (“we’ll provide official access and train everyone soon”) ignores an uncomfortable fact: BYOAI is not a fad; it’s an asymmetry of power. Half of all employees admit to using unapproved tools, and they wouldn’t stop even if you banned them. The incentive is obvious: less friction, higher performance, and with it, better evaluations and opportunities.

This “shadow AI” is the natural extension of “shadow IT,” but this time with qualitative consequences: an external model can leak data, yes, but it can also accumulate the organisation’s tacit knowledge — and walk out the door with the employee the day they leave.

Sociology, not technology

The real shift is not technological; it's sociological. Most users, the non-experts, will simply adopt whatever OpenAI, Google, Microsoft, Perplexity, Anthropic, or others give them, using those models as cognitive appliances: plug and play, nothing more. And they will share all the data they generate with these companies, which will exploit it (that is, "monetise" it) to oblivion.

But a different type of professional is already emerging: the truly competent, AI-savvy users who build or assemble their own agents, feed them with their data, fine-tune them, run them on their own infrastructure, and treat them as part of their personal capital. This person no longer “uses software”: they work with their personal AI. Without it, their productivity, their method, and even their professional identity collapse. Telling them to abandon their agent to comply with a corporate list of “approved tools” is like telling a professional guitar player to play on a toy guitar. The result will always be worse. 

This reality forces companies to rethink incentives. If you want that caliber of talent, you cannot hope to blunt their edge with policy memos. Just as BYOD ended with corporate devices inside secure containers, BYOA will end with enclaves of trusted compute inside the corporate perimeter: spaces where a professional’s personal agent can operate with model attestation, sealed weights, clearly defined data perimeters, transparent telemetry, and cryptographic limits. The goal is not to standardise agents, but to make their coexistence possible — safe for the business, free for the professional.

The prognosis

My prognosis is clear. First, contracts will evolve. Expect "algorithmic clauses" spelling out the use of personal agents: declaration of models and datasets, isolation requirements, audit rights over outputs that shape key decisions, and obligations of portability and deletion when employment ends. Alongside that, new perks will emerge: compute stipends, inference credits, and subsidies for local hardware or edge nodes.

Second, security and compliance will shift from the fantasy of “eradicating shadow AI” to the reality of managing it: explicit, inventoried, and auditable. Companies that get this sooner will capture the value. Those that don’t will keep bleeding talent. 

Third, this will exacerbate the competition for talent, and management culture will also need to grow up. The manager who clings to tool uniformity will drive away exactly those employees who make AI a force multiplier. The metric that matters will not be obedience to corporate software lists, but performance that is verifiable and traceable. Do you have employees like this in your company? If so, protect them at all costs. If not, you should be worried: the best talent won't even consider working with you.

Leaders will have to learn to evaluate human–machine outcomes, to decide when to delegate to the agent and when not to, and to design processes where hybrid teams are the default. Ignoring this is not caution — it’s a gift to your competitors. 

Denial is a waste of time

This is why BYOA is not a discipline problem. It is the recognition that knowledge work is already mediated by agents. Denying it is a waste of time. Accepting it means more than licenses and bans: it requires redrawing trust, responsibility, and intellectual property in an economy where human capital literally arrives with its own algorithm under its arm.

The organisations that understand this will stop asking whether they “allow” people to bring their AI, and start asking how to turn that fact into a strategic advantage. The rest will keep wondering why the best people don’t want to, or simply cannot, work without theirs.

ABOUT THE AUTHOR

Enrique Dans has been teaching Innovation at IE Business School since 1990, hacking education as Senior Advisor for Digital Transformation at IE University, and now hacking it even more at Turing Dream. He writes every day on Medium. 

FAST COMPANY