If you think robots are years, if not decades, away, you’re sorely mistaken.
Just recently, Microsoft launched an interesting artificial intelligence experiment. Tay was a Twitter bot that learned from “her” surroundings. It wasn’t long before she decided that the world is a vicious place…and she changed her tweeting patterns accordingly. Tay became offensive and even racist. Needless to say, the company shut her down, quite literally.
Around the same time they introduced Tay to the world, Microsoft launched CaptionBot. The program claims to be able to “understand the content of any image.” It goes on to say, “I’ll try to describe [the photo] as well as any human.” While it’s not all that far off, sometimes it can get things horribly, and hilariously, wrong. Here are a few favorites that we tried out…
1. Starting off strong, CB.
2. I’ll give it a five out of ten here.
3. Neither can I…neither can I.
4. What is “angry camel”?
5. CaptionBot guessed a little girl meeting Garfield was too NSFW…
6. I mean, yeah, two of the most recognizable people in the world is a tough one.
7. A seemingly undead doll-girl.
8. But what if the house is a she? That’s sexist, CaptionBot.
9. Honestly, I can neither confirm nor deny this is correct.
11. Actually, it’s called a shower curtain, but I can see how you got there…
12. Nailed it.
13. If only all tables were made of tiny puppies…
14. Just no, not even close.
15. Food is fuel. CaptionBot doesn’t need it.
16. I can see how this might make you sick, big guy.
17. CaptionBot doesn’t know what to do with Halloween costumes.
18. At least it didn’t break the Internet again.
19. *Shakes head.*
20. CaptionBot, are you un-American?!
Oh, CaptionBot…you may not be racist, but you have a lot of learning to do. You had ONE job…