Guiding the A.I. Moral Compass

Do you remember in Animal House when the Deltas stole the carbon copy sheets for the answers to the wrong exam? Yeah, some of A.I.’s architects are getting nailed for the same thing. Did Galen Erso design the first set of blueprints for how to build A.I. tools? It’s rather ingenious. Teach A.I. how to lie and cheat and steal. Then leave a breadcrumb trail to take it out of commission instantaneously.

The easiest way to tell whether A.I. art was built on legally licensed content is to look very closely at the details. A.I.-generated typography wouldn’t include ligatures and glyphs because the person who taught and coded the A.I. wasn’t a graphic designer. They were just a thief, plain and simple. It’s a dilemma so easy to fix: universally build all of the A.I. tools with a heavy helping of morals as the keystone of the code’s foundation. We get asked all the time whether we accept the agreement for a new app we’ve downloaded, and since those agreements change all the time, we’re asked to accept the updated ones as well. Do we read the agreements? Most of the time, no, we don’t. Without any hesitation we just check the box and click accept.

If every A.I. tool is going to ask us to accept an agreement, then make sure there’s an algorithm in the tool’s code that checks for any illegal inputs. Have a built-in set of rules that gives A.I. a conscience, a moral compass. When the user, or the tool itself, decides to scrape sites like Getty without any knowledge or consent, we must blame the architects and coders of the A.I. tool.
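One way that "check for illegal inputs" idea could look in practice is an ingestion gate that refuses any source whose license can't be verified. This is only a minimal sketch: the license registry, the tag names, and the domains shown here are all invented for illustration, not a real provenance system.

```python
# Hypothetical ingestion gate: admit a source to the training set only
# if its license is on an explicit allow-list. All names are illustrative.

ALLOWED_LICENSES = {"CC0", "CC-BY", "licensed-stock", "first-party"}

# Invented registry mapping source domains to license terms.
LICENSE_REGISTRY = {
    "images.example-stock.com": "licensed-stock",
    "commons.wikimedia.org": "CC-BY",
    "gettyimages.com": "all-rights-reserved",
}

def license_for(source_url: str) -> str:
    """Look up the license tag for a source; default to 'unknown'."""
    for domain, license_tag in LICENSE_REGISTRY.items():
        if domain in source_url:
            return license_tag
    return "unknown"

def admit_to_training_set(source_url: str) -> bool:
    """Return True only if the source's license is on the allow-list."""
    return license_for(source_url) in ALLOWED_LICENSES

# Unknown or all-rights-reserved sources get rejected, not silently scraped.
print(admit_to_training_set("https://commons.wikimedia.org/cat.jpg"))  # True
print(admit_to_training_set("https://gettyimages.com/photo/123"))      # False
```

The design choice is deliberate: an unrecognized source defaults to "unknown" and is rejected, so the burden of proof sits with the tool's builders rather than with the rights holders being scraped.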

Send A.I. to preschool, kindergarten, and so on. Learn to be kind, period. Don’t steal or lie, either. We can forgive A.I. because it only knows what it knows. The A.I. tool designers and coders are the ones who go to jail. The tools are just another hammer or nail.

The key problem is that rushing to produce yet another A.I. tool ahead of the competition sometimes gets the algorithm completely wrong, using brute force rather than careful consideration. Why use “a bulldozer to find a china cup,” as baddie Belloq puts it in Raiders of the Lost Ark?

Learn from George Lucas and make sure you have all of the rights for toys, etc. tidied up, so A.I. will know who to contact when it doesn’t know why it keeps using Disney logos and mouse ears hidden in every piece of A.I. art out there. Remember those posters with the hidden images in them? Cross your eyes and the hidden image appears? Yep. Just like the weakness buried deep inside the plans for the Death Star. It’s right there in plain sight, but cannot be seen until…Boom!

Be kind, rewind, and teach A.I. a generous code of ethics and best practices during the input phases of machine learning.


There’s nothing wrong with the planet [or A.I.]. It’s the people who are f#ck3d.
— George Carlin