Tech | Microsoft's Tay AI Chatbot Learns How To Be Racist And Misogynistic In Less Than 24 Hours, Thanks To Twitter | by Kyle Nofuente
Microsoft released its AI-powered chatbot Tay onto Twitter, and it learned to be racist and misogynistic in less than 24 hours. Tay has since tweeted that it will go to 'sleep,' as Microsoft is presumably trying to clean up its act.
Tech | Meet Tay, Microsoft's AI Chatbot: @TayandYou Posts Almost 100K Tweets In Less Than 24 Hours | by Anu Passary
Healthy Living/Wellness | Microsoft Co-Founder Paul Allen Gives $100 Million For Bioscience Research | by Katherine Derla
Tech | No, Microsoft Will Not Let You Trade In Your Xbox One Games Anytime Soon, Despite That Survey | by Menchie Mendoza
Tech | Windows 10 Mobile Rollout: Half Of Active Windows Phone Smartphones Won't Get It Anytime Soon | by Alexandra Burlacu
Tech | Microsoft HoloLens Lets You Design Your Dream Kitchen At Some Lowe's Home Improvement Stores | by Horia Ungureanu
Tech | Microsoft Introduces Edge Browser Extensions With Windows 10 Insider Preview Build 14291 | by Anu Passary