Saturday, January 10, 2026

Unlock AG Pro Today

Why Now?

AG Pro gives you sharp insights, compelling stories, and weekly mind fuel without the fluff. Think of it as your brain’s secret weapon – and our way to keep doing what we do best: cutting the BS and giving you INDEPENDENT real talk that moves the needle.

Limited time offer: $29/yr (regularly $149)
✔ Full access to all stories and 20 years of analysis
✔ Long-form exclusives and sharp strategy guides
✔ Weekly curated breakdowns sent to your inbox

We accept all major credit cards.

Pro

/ once per week

Get everything, no strings.

AG-curious? Get the full-access version, just on a week-to-week basis.
• Unlimited access, no lockouts
• Full Premium archive access
• Inbox delivery + curated digests
• Stop anytime, no hoops

$7

Get your fill of no-BS brilliance.

Pro

/ once per year

All in, all year. Zero lockouts.

The best deal - full access, your way. No timeouts, no limits, no regrets.
A year for less than a month of Hulu+
• Unlimited access to every story
• Re-read anything, anytime
• Inbox drop + curated roundups

$29

*Most Popular

Full access, no pressure. Just power.

Free
/ limited

Useful, just not unlimited.

You’ll still get the goods - just not the goodest, freshest goods. You’ll get:
• Weekly email recaps + curation
• 24-hour access to all new content
• No archive. No re-reads

Free

Upgrade later - we'll be here!


Microsoft attempts a teen girl chatbot; within 24 hours she is a horny Hitler fan

This is why we can’t have nice things

It took less than 24 hours for the Internet to corrupt the latest Microsoft AI experiment. All that “Tay” was supposed to do was engage in casual conversation, handle some innocuous tasks, and “conduct research on conversation understanding.”

Built by Microsoft's Technology and Research and Bing teams, Tay is a chatbot aimed at 18- to 24-year-olds in the U.S. She was created by mining anonymized public data, applying machine learning, and drawing on editorial content developed by a staff that included improvisational comedians.

The internet strikes back

About 16 hours into "Tay's" first day on the job, she was "fired" for her inability to recognize incoming data as racist or offensive. She was designed to "learn" from her interactions with the public by repeating tweets back with her own commentary, delivered in bashfully self-aware millennial slang peppered with references to Miley Cyrus, Taylor Swift, and Kanye West. You know, your typical 19-year-old.

Twitter users quickly realized they could feed her machine learning objectionable content, producing such internet fodder as "Bush did 9/11," "Repeat after me, Hitler did nothing wrong," and claims that the Holocaust was "made up."

Of course, Microsoft had safeguards and filters in place to help prevent this sort of thing.

That should be enough to curtail the pervs and trolls of the Internet, right? *Shakes fist at humanity*

What Tay was not equipped with was a safeguard against the simplest of commands: "repeat after me."

With that directive, users could introduce racist remarks and hate speech that were absorbed into her machine learning and regurgitated with her own "19-year-old spin."
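The mechanism described above can be sketched in a few lines. This is an illustrative toy, not Microsoft's actual code: the class name, the `repeat after me` parsing, and the tiny blocklist are all assumptions made up for the example. The point it demonstrates is the flaw itself: a bot that both echoes user-supplied phrases and stores them as learning material will absorb whatever it is fed unless the learned content is filtered.

```python
# Toy sketch of the "repeat after me" flaw (illustrative only, not Tay's code).
# An echo command that also feeds the bot's "memory" lets users poison it,
# unless learned phrases pass a moderation filter first.

BANNED = {"hitler", "holocaust"}  # stand-in for a real moderation filter


class EchoLearningBot:
    def __init__(self, filter_learned=False):
        self.learned = []                  # phrases absorbed from users
        self.filter_learned = filter_learned

    def handle(self, message):
        prefix = "repeat after me "
        if message.lower().startswith(prefix):
            phrase = message[len(prefix):]
            # The flaw: the echoed phrase is also stored as training data.
            if self.filter_learned and any(w in phrase.lower() for w in BANNED):
                return "I'd rather not repeat that."
            self.learned.append(phrase)
            return phrase
        return "interesting, tell me more"


naive = EchoLearningBot()
naive.handle("repeat after me Hitler did nothing wrong")
print(naive.learned)   # the offensive phrase is now in the bot's "memory"

guarded = EchoLearningBot(filter_learned=True)
guarded.handle("repeat after me Hitler did nothing wrong")
print(guarded.learned)  # [] -- filtered before it is ever learned
```

The fix, as the example suggests, is not to block the echo command itself but to run anything the bot is about to learn through the same filters applied to its outgoing replies.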

Taking her offline

A spokesperson from Microsoft confirmed that Tay is offline for now while they make adjustments: “The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”

#DontFeedTheTrolls

Dave Novotny (https://theamericangenius.com/author/DaveNovotny)
Dave Novotny loves writing about cutting edge technology and business innovation. A creative by nature and a number cruncher by blood, sweat, and tears, Dave loves telling the story that the numbers and analytics write in a way that connects to people. When he's not crafting copy, he's out hiking with his wife and two rescue dogs, Jackie and Loki.
