The Committee for Artificial Intelligence and Other Moral Quandaries

When I first started leading a Data Governance Council, my friend Claire said it sounded like something you’d need a ceremonial robe for: something where people chant in unison before declaring that, yes, Excel files stored on individual desktops are still a problem. Instead, it turned out to be six people, three of whom never turned on their cameras, discussing the ethics of machine learning while a golden retriever barked in the background.

The topic that first day was data ethics, which, for those outside the club, is the art of making sure your company doesn’t accidentally build the Terminator. Claire was the facilitator, and she started the meeting with, “Our goal today is to make sure our data is accessible, accurate, and ethically used.”

“Like a priest’s confession log,” I offered.

Claire smiled politely. “Not quite.”

The problem, she explained, was that everyone in our client’s company wanted to use Artificial Intelligence but no one wanted to do the unsexy work required to feed it clean, fair data. They wanted self-driving insights, but their data was closer to a rusted tricycle with one training wheel missing.

“AI is only as good as the data it learns from,” Claire said. “If our data is biased, incomplete, or poorly governed, our AI will be too.”

Which made perfect sense, though it reminded me of my childhood report cards. Garbage in, garbage out.

The Council’s role, apparently, was to stop that garbage. Ethically, of course. That meant defining who could access what, ensuring that data was accurate, and establishing availability rules that balanced innovation with privacy. In other words, we were the digital version of the neighborhood watch. Only instead of suspicious strangers, we were keeping an eye on spreadsheets and other exciting file types.

At first, it felt absurd. But as the meetings went on, I began to see the point. Data, in the modern corporate world, is a lot like gossip: powerful, easily misused, and always traveling faster than it should. And just like gossip, the danger comes when it loses its context: the twisting, the casual sharing with people who shouldn’t have it.

That’s where the Governance Council came in. We started small, naming data owners, cataloging sources, and defining what “good” data actually meant. You’d think this would be obvious, but when three departments define “customer” differently, you start to understand why your AI thinks half your clients are imaginary.

We built rules for access, deciding who gets to see what and when, and set up guardrails for data quality. We discussed the ethics of anonymization, retention, and model transparency. There were arguments about bias mitigation and consent. One particularly heated exchange involved whether our client’s AI chatbots should be allowed to suggest financial products based on age.

“So, like, no more ‘Because you’re 40, here’s a midlife crisis loan?’” I asked. Claire did not laugh.

Over time, I realized the Council wasn’t just the bureaucratic nuisance I had always assumed it to be. It was a moral compass for a machine age. Rather than deciding whether AI COULD do something, we were deciding whether it SHOULD. In the process, we rediscovered something quaint and almost human: responsibility.

Setting up a Data Governance Council doesn’t make your company holy. It won’t save the world from bad algorithms or prevent a rogue intern from uploading confidential data to ChatGPT. But it creates a space, a simple pause button, where smart people can ask hard questions before the code starts running. Questions like: Should we use this data at all? Do the people it represents know we’re using it? If we’re wrong, who gets hurt?

Those aren’t IT questions. They’re ethical ones. And for all our dashboards and machine learning pipelines, ethics doesn’t live in code. It lives in conversation, in the uncomfortable, slow, and deeply human act of deciding what’s right.

After a few months, we had frameworks, policies, and more acronyms than the Pentagon. But we also had something rarer: trust. We could measure that in KPIs like uptime percentages and SLA compliance, sure, but what mattered more were the intangibles. People believed that the data, the systems, and, most importantly, the people behind them were worthy of belief.

Sometimes, during our meetings, I imagined the AI we were nurturing. It didn’t yet exist, but I imagined that, when it looked back on its creators, the weary Council members on Teams calls, it would find us competent (hopefully) but more than that: decent. That’s the real goal of governance. Access, control, quality, and security are the easy part. Morality and decency are not. Decency in a world increasingly run by machines that don’t know what that word means is a lofty and challenging, but ultimately worthy, goal.

If we can manage that, if our Data Governance Councils can help us use data with integrity and care, maybe we deserve to call ourselves intelligent too.

Skates in the Fall

The bus hissed as it came to a stop, exhaling a cloud of exhaust that mingled with the damp, metallic scent of the city. The woman stepped down carefully, one hand clutching the railing, the other gripping the strap of her bag. Her shoes were sensible, brown leather with scuffed toes. Her coat hung oddly from her shoulders, too large, perhaps made for a broader woman. Her skirt had a stubborn crease running diagonally across it, as if it had given up trying to look proper halfway through the morning. 

Around her, the crowd surged, faces blurred, eyes averted, their footsteps a clamor that swallowed her silent grace. The air hung heavy, thick with rust and the faint musk of leaves, whispering rain to come.

She began to walk. The crowd around her surged and scattered, people pushing past like a tide unwilling to acknowledge the existence of a small, unremarkable rock in its path. She looked neither left nor right, her chin slightly tucked, her glasses fogged. Her bag, a large and ornate leather thing embroidered with fading gold thread, bumped against her knee as she went. It looked absurdly out of place. Too fine for her, too old for the time.

At the crosswalk, a yellow taxi screeched to a halt inches from her knee. The driver, red-faced, shouted something vile from behind the window. She flinched just slightly, but did not stop. Her expression did not change. She stepped forward, her feet steady, and crossed.

After a few blocks, she reached a park, an urban pretense of mercy, where concrete yielded to a wide square ringed by benches and trees, their branches half-stripped, leaves pirouetting on unseen breezes. Food trucks lined the square, their vents sizzling and rumbling, perfuming the air with onions and grease. A man in rags shook a cup for change. Another stood on a milk crate shouting into the wind about salvation and the end of days. People sat eating their lunches, staring into phones, into the middle distance, into nothing.

The woman moved through them, her bag swaying, its gold thread a faint pulse of light, her steps soft but sure. She found an empty bench and sat slowly, as if remembering how to do it, and placed her bag beside her. For a long while she did nothing. Then, with the unhurried grace of ritual, she opened the clasp and drew out a pair of roller skates: white leather, yellowed with age, their wheels polished to a dull gleam. They looked older than she did, though they had clearly been loved.

She laced the skates with care, fingers tracing the worn leather as if greeting an old friend. She stood. The wind stirred, lifting leaves in a quiet summons. 

Then, she moved. A slow glide at first, testing the ground’s pulse. Her body softened, swayed, found its song. Her skates sang against the concrete, a low, rolling murmur, like waves kissing a shore. Her arms rose, carving arcs in the heavy air, and her coat flared, catching the wind’s embrace. 

Leaves, brittle and gold, rose in her wake, twirling as if summoned to dance. She wove through them, her body a thread stitching earth to sky, her breath steady, matching the rhythm of her wheels. Her skin tingled as the wind brushed her cheeks, cool and sharp, like a lover’s fleeting touch. Each turn sent a shiver through her, her muscles warm, alive, as if the earth pulsed beneath her.

Around her, the world refused to pause. The preacher’s voice cracked, railing against unseen sins. A woman scrolled through her phone, thumb flicking past headlines. A man bit into his sandwich, mustard smearing his tie, eyes fixed on nothing. They did not see her spin, did not hear the soft hum of her wheels, did not feel the wind that danced with her. She was a secret kept by the air itself, unnoticed, unwitnessed, yet radiant.

She spun, and the world spun with her, leaves spiraling, wind curling, sky holding its breath. Her fingers brushed the air, tracing invisible lines, as if sketching a world only she could see. Her lips curved, not quite a smile, but a softening, as if she’d found something lost long ago. 

Then, as the first drops of rain kissed the pavement, she slowed, her skates whispering a final note. The wind fell still, reluctant to release her. She returned to the bench, her breath soft, her face flushed with a secret joy. 

One by one, she unlaced the skates, wiping their wheels with a handkerchief, tucking them back into her bag like a memory folded away. The leaves at her feet settled, damp and still, as the rain began to fall. Her dance, like the leaves, like the rain, was gone. Unseen, unclaimed, but hers alone. The world resumed its indifferent rhythm.

She lifted the bag, stood, and walked back toward the street. The crowd swallowed her. No one turned to see her go. The rain fell harder, pinning leaves to the pavement like small, defeated birds.

The Keys To The Kingdom -or- Governance Isn’t A Four-Letter Word

When I first heard the word “governance” used in a meeting, I imagined a man in a powdered wig and culottes standing at the door of a server room, holding a clipboard and saying, “You shall not pass.” I pictured a cross between Gandalf and George Washington, keeping the unwashed data masses from sullying the sanctity of enterprise systems.

And for a long time, that’s exactly how we treated it. Governance was a velvet rope, and only the properly credentialed could step inside. We built forms, we built approval chains, and we built policies so thick you could use them as flotation devices in case of a compliance emergency.

Gates are effective and, in many cases, necessary. But the problem here is that, while they do a good job of keeping things out, they sometimes do TOO good a job of it. In the modern enterprise, where everyone from finance analysts to HR business partners is suddenly “building an app” or “running a flow,” keeping things out is the fastest way to make yourself irrelevant.

The old model of governance was simple: people are dangerous, so you must protect the system from the people. Every new connector was a potential scandal, every Power App a ticking time bomb. The governing body’s role was to say “no” gracefully, like a maître d’ at a restaurant that’s fully booked for eternity.

But then the world changed. Low-code platforms took the wheel, automation became the new oxygen, and the governance-as-gatekeeper approach started to creak under the pressure of its own uselessness. Suddenly, the people outside the rope were building anyway. They were wiring together approvals and forms and dashboards, not because they wanted to break the rules, but because they wanted to work.

We learned an important rule in this effort: when you spend all your time keeping people out, they eventually stop knocking.

In corporate governance meetings, trust is the word that gets used like parsley: sprinkled on everything for flavor, but rarely meant. We talk about “building trust” while drafting 12-page forms that ask, “Why do you need this connector?” in three different sections. We say, “We trust our makers,” then build dashboards to monitor every keystroke. It’s the bureaucratic equivalent of saying, “Of course I love you,” while secretly running a background check.

Real trust doesn’t mean no oversight. It means assuming competence, not chaos. It’s giving someone the keys to the car and believing they’ll fill it with gas rather than drive it into the lake. Trust is also contagious. When governance teams stop policing and start partnering, something almost magical happens: people want to do things the right way. Because it’s their idea, not yours.

The dirty secret of governance is that most of what we call “noncompliance” is actually illiteracy. People don’t break rules out of malice; they break them because they don’t know the rules exist, or because the rules read like a cross between ancient Greek and IRS tax code. That’s where literacy comes in.

Modern Centers of Excellence aren’t libraries of rules. They’re classrooms of context. The best CoEs I’ve seen don’t issue edicts; they hold office hours. They teach people how to fish, then give them a well-documented rod and an FAQ. They translate “don’t use personal credentials for production” into “here’s how to use a managed service account and why it saves your weekend.”

It’s not about dumbing things down. It’s about lifting people up. When you build literacy, governance stops being a scary word and starts being a shared language. It’s like when you finally learn what “quarterly earnings” actually means. You may still not care, but at least you understand why someone else does.

Now, I’ll admit, not everyone should have the same keys. Some people will absolutely drive the car into the lake. That’s where tiered governance comes in. It’s the art of saying “yes” at different speeds.

At its heart, tiered governance is about designing a system that assumes both brilliance and fallibility. You create a space for explorers, for the makers who can prototype and learn, and another for professionals who can publish and scale.

It’s a little like parenting. You don’t let your kid use the stove the first time they ask, but you also don’t tell them they’ll burn down the house forever. You teach, you supervise, you adjust. Eventually, you hand over the spatula.

That brings us to the new rule of good governance: it’s not about gates. It’s about growth. When you get it right, governance starts to feel less like airport security and more like a good dinner party. The CoE becomes a host, not a warden; someone who says, “Welcome! Let me show you where the good silverware is, and please don’t use the salad fork for soup.”

You want people to feel empowered, not inspected. You want them to leave the table knowing more than they did when they sat down. Maybe they’ll even send a thank-you email afterward.

Governance, done right, is a guide, not a gate. It’s the quiet art of creating boundaries that help people thrive, not barriers that keep them small. So the next time someone calls your CoE “the gatekeepers,” smile politely, and hand them a key.

After all, what’s the point of building the kingdom if no one’s allowed inside?

Screaming Into The Void

Creating content online is a lot like screaming into the void, except the void doesn’t bother to echo. You spend hours hunched over your laptop like some starving poet, convinced you’ve concocted the perfect turn of phrase. Maybe it’s a blog post about the inherent tragedy of decorative throw pillows, or a TikTok where you lip-sync to a Céline Dion song while ironing a grilled cheese. You hit “publish,” sit back, and wait for the applause that never comes. Not even your parents click “like.” And they once liked a Facebook page dedicated to horse dentures, so it’s not like their standards are particularly high.

Nothing. Not even a “seen.” The void stares back, unimpressed.

This is the quiet tragedy of content creation. You can float the most unhinged ideas (ex: a podcast where every episode is just you describing pictures of sandwiches you find on Google), and the world will yawn. No outrage. No applause. No feedback of any kind. If the internet were your therapist, you’d switch providers immediately.

But then.

You make a single, inane comment on LinkedIn. Something as innocuous as, “I don’t think synergy is a real word.” Suddenly, the gates of hell creak open. Out pour the consultants, the career coaches, the people who list “visionary” as both a skill and a hobby. They descend upon you with the fury of a thousand unpaid interns.

“Excuse me,” someone will type, “but as a thought leader in the space of holistic disruption, I find your remark deeply offensive.” Another person, whose profile picture is an AI-generated portrait in front of a stock photo of a WeWork lobby, will write an 800-word reply complete with bullet points, Harvard Business Review citations, and a chart in Comic Sans.

Apparently, the internet does not care when you post a surrealist video about vacuuming your yard, but God help you if you suggest that hustle culture might not be the pinnacle of human achievement. Then the scum rises. Not the bottom-feeders you’d expect, either. These are the self-proclaimed “builders,” the “connectors,” the men who describe themselves as “dad, runner, disruptor” in that order. They’ll tell you how wrong you are, how shortsighted, how negative. And they’ll do it with the kind of zeal usually reserved for defending family honor in a duel.

This is the paradox of the digital age: your boldest, strangest creations sink without a ripple, but misplace a single emoji on a corporate platform and suddenly you’re the Antichrist. Somewhere out there is a void waiting patiently for your screams, but the internet prefers you whisper something stupid at a networking event.

And that’s how I learned my most valuable lesson about online life: if you really want attention, don’t bother with originality, effort, or joy. Just say something vaguely critical on LinkedIn. Then duck.

Don’t Drink the Buttermilk: Data Governance in the Age of AI

When I was younger, my mother insisted that we label the shelves in the refrigerator. “Milk,” “Condiments,” “Leftovers.” It was a system designed to prevent catastrophe, or at least to keep my father from drinking bleu cheese dressing straight from the bottle under the assumption it was buttermilk.

I thought this was ridiculous. The milk knew where it was. Why not trust it to find its own way home?

Fast forward thirty years, and here I am, sitting in a meeting about data governance, explaining to a group of engineers why we cannot, in fact, allow machine learning models to drink directly from the “condiments” shelf.

“Why not?” someone asked, with the same incredulity I once had toward my mother’s fridge. “The data’s all there.”

Yes, but so is the ketchup, the horseradish, and that Tupperware of regretful lasagna from 2018. AI, left unsupervised, will happily eat it all and tell you with great confidence that the population of France is marinara.

Governance isn’t glamorous. Nobody goes into tech to write metadata policies or create retention schedules. They want to build robots that compose symphonies or tell jokes about dogs in French. But without the boring stuff, without the labels, the rules, or the grown-up supervision, you don’t get robots. You get chaos. And chaos doesn’t sing. It burps.

Data democratization, meanwhile, sounds far nobler than it is. “Power to the people,” we say, while handing everyone in the company a golden key to the database. It feels like Woodstock for spreadsheets: free love, free access, free analytics. But if you’ve ever watched a toddler try to pour milk from a gallon jug, you know what happens when you give freedom without structure. It’s not democracy. It’s a kitchen floor full of dairy.

The promise of AI makes this all more urgent, because AI is a very eager intern who lies. It will produce an answer to anything you ask, regardless of whether it has any actual knowledge, because its job description is “pleasing authority figures at any cost.” Governance is the uncomfortable adult in the room reminding everyone that the answer still needs to be right.

I sometimes fantasize about what it would be like if people treated their own lives the way they treat corporate data. Imagine your family photo albums scattered randomly across five different attics, basements, and glove compartments. Grandma’s birth certificate is in a shoebox labeled “Halloween Decorations,” and your high school yearbook lives in the freezer next to the peas. “Don’t worry,” you say, “we’ll let AI find it.” And then AI proudly hands you a picture of a cat in a pilgrim costume.

So yes, data governance is boring. It’s milk-shelf labeling, and it’s telling your overeager intern that, no, horseradish is not a population statistic. But in the age of AI, boring is the only thing standing between us and a world where business strategy depends on marinara.

And trust me, nobody wants that. Not even my father.