Outside The Machine

At a data conference this week. Lots of talk about the future. Not much talk about those left behind. Here’s a poem about that.

******

It’s like a cold wind,
blowing through the streets, through the wires,
through the circuits and the high-rise dreams.
Everyone’s talking about the future,
but nobody’s asking if we got the password to get in.
They build the towers tall,
shiny glass fingers stretching for the sky,
and down here,
we look up, wondering what the hell they reaching for.

I see the screens glow bright,
but it’s not for us.
Nah, we stuck outside, faces pressed against the glass,
watching the world move fast,
faster than the bus that don’t show up,
faster than the hours that don’t pay enough,
faster than they tell us to catch up.

“Learn to code,” they say.
“Just get online,” they say.
But what happens when your Wi-Fi’s a prayer,
and your data’s gone before the rent’s paid?
What happens when you’re stuck
using a phone three generations old
to fill out forms they never meant you to complete?

They say technology’s the great equalizer—
but how equal can you be
when the gatekeepers got keys you can’t afford?
They’re racing toward tomorrow,
leaving us in the dust,
telling us, “You should’ve moved faster,
you should’ve planned better,
you should’ve known the game was rigged.”

But this is more than bandwidth, more than lag.
It’s being left in the cracks,
where opportunities don’t reach,
where futures get blurry behind pop-up ads
for things we’ll never buy.

See, it’s not just about who’s connected—
it’s about who gets left behind.
And while they talking about 5G,
we’re just trying to get free,
free from being forgotten,
free from the spaces they erased us from,
where we don’t exist, except in footnotes and fines.

It’s like we’re ghosts in their machine,
whispering in the background,
but they don’t hear us.
Not in their algorithms, not in their plans,
not in their world where we’re always
just a glitch they trying to ignore.

But we here.
We’re still here.
And one day,
they gonna hear our voices
louder than their download speeds,
breaking through the static,
telling the truth they can’t scroll past,
a truth that won’t get lost
no matter how far they run.

Misplaced Faith in Technology

As people put more and more of their faith into technology and systems they don’t understand, what happens when those systems are compromised? That’s what AI security expert Dawn Song wonders.

Artificial intelligence won’t revolutionize anything if hackers can mess with it.

That’s the warning from Dawn Song, a professor at UC Berkeley who specializes in studying the security risks involved with AI and machine learning.

Speaking at EmTech Digital, an event in San Francisco produced by MIT Technology Review, Song warned that new techniques for probing and manipulating machine-learning systems—known in the field as “adversarial machine learning” methods—could cause big problems for anyone looking to harness the power of AI in business.

Song said adversarial machine learning could be used to attack just about any system built on the technology.

“It’s a big problem,” she told the audience. “We need to come together to fix it.”

Adversarial machine learning involves experimentally feeding input into an algorithm to reveal the information it was trained on, or distorting input in a way that causes the system to misbehave. By feeding lots of images into a computer vision algorithm, for example, it is possible to reverse-engineer how it behaves and force it to produce particular outputs, including incorrect ones.
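Concretely, the “distorting input” half of this usually works by nudging each input value in the direction that most increases the model’s error. Here is a minimal sketch of one common recipe, the fast gradient sign method (FGSM); the toy classifier, random image, and epsilon value are illustrative assumptions, not anything from Song’s work.

```python
# A minimal FGSM sketch: perturb an input along the sign of the loss gradient.
# The model and image below are placeholders; any differentiable classifier
# could be attacked the same way.
import torch
import torch.nn as nn

def fgsm_perturb(model, image, true_label, epsilon=0.03):
    """Return a copy of `image` nudged in the direction that increases the loss."""
    image = image.clone().detach().requires_grad_(True)
    loss = nn.CrossEntropyLoss()(model(image), true_label)
    loss.backward()
    # Move each pixel a small step in the direction of the loss gradient's sign.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0.0, 1.0).detach()

# Example usage with a toy linear "classifier" standing in for a real vision model.
model = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
image = torch.rand(1, 3, 32, 32)   # a fake 32x32 RGB image
label = torch.tensor([0])          # its (assumed) true class
adv = fgsm_perturb(model, image, label)
print((adv - image).abs().max())   # perturbation stays within epsilon
```

To a person the perturbed image looks unchanged, but the model’s prediction can flip, which is exactly the failure mode Song is warning about.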

Song presented several examples of adversarial-learning trickery that her research group has explored.

One project, conducted in collaboration with Google, involved probing machine-learning algorithms trained to generate automatic responses from e-mail messages (in this case the Enron e-mail data set). The effort showed that by creating the right messages, it is possible to have the machine model spit out sensitive data such as credit card numbers. The findings were used by Google to prevent Smart Compose, the tool that auto-generates text in Gmail, from being exploited.
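The principle behind that kind of probing is that a model tends to assign unusually high likelihood to strings it memorized during training, so an attacker can rank candidate secrets by how “likely” the model finds them. The sketch below illustrates only that general idea with a made-up character-level bigram model and a fake card number; it is not the Berkeley/Google methodology or the Enron data.

```python
# A self-contained toy: a language model that has memorized a secret from its
# training text will score that secret far higher than random strings of the
# same shape. Everything here is illustrative, including the planted "secret".
import math
import random
from collections import Counter, defaultdict

SECRET = "4111 2222 3333 4444"   # fake card number planted in the corpus
corpus = ("please send the report today. " * 50 +
          "my card number is " + SECRET + ". " +
          "meeting moved to friday. " * 50)

# Train a character-level bigram model (counts of next char given current char).
counts = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    counts[a][b] += 1

def log_likelihood(text):
    """Average per-character log-probability of `text` under the bigram model."""
    total = 0.0
    for a, b in zip(text, text[1:]):
        seen = sum(counts[a].values())
        total += math.log((counts[a][b] + 1) / (seen + 256))  # add-one smoothing
    return total / max(len(text) - 1, 1)

# The attacker compares the planted secret against random candidates of the
# same format; the memorized one scores noticeably higher.
random.seed(0)
candidates = ["".join(random.choice("0123456789") if c.isdigit() else c
                      for c in SECRET) for _ in range(5)]
for text in [SECRET] + candidates:
    print(f"{text!r:>26}  score={log_likelihood(text):.3f}")
```

Real extraction attacks on large models are far more involved, but the ranking trick is the same: memorization leaves a statistical fingerprint an attacker can query for.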

Another project involved modifying road signs with a few innocuous-looking stickers to fool the computer vision systems used in many vehicles. In a video demo, Song showed how the car could be tricked into thinking that a stop sign actually says the speed limit is 45 miles per hour. This could be a huge problem for an automated driving system that relies on such information.

The utopian impulse to remove the human element from everything that makes us human is one of the more dangerous tendencies our society indulges. Algorithms can be hacked just like databases and web servers. Whatever security we invent will eventually fall prey to people who seek to destroy or take advantage of others.