Kids born today will probably never drive a car. Self-driving cars will make the route between home and the grocery store as mysterious as the route between JFK and LAX. If the system hiccups, they will have no idea how to get home. If we apply this phenomenon to just about everything we do, what level of learned helplessness and learned ignorance/stupidity will we see in the future?
So, knowing this is on the horizon (or already here), how do we deal with it? I avoid digital maps because I don't want to lose the ability to navigate, but there is a cost to doing so. I stopped to ask for directions in LA recently and the guy looked at me as if I were an idiot. Will we have to become Luddites in order to maintain our autonomy?
Luddite-wise, the key is to resist the algorithms. Personally, this can be done by deleting cookies, blocking tracking (uBlock, Ghostery, Random Agent Spoofer), and reading a wide selection of sources. In particular, it helps to get out of the filter bubble, in which news algos present you with articles while trying to get you to Like and Share them. Reading books is a huge differentiator (60% of Americans haven't read a single book in the past year, and admitting to this is apparently not considered shameful at all).
The problem with this is that focusing only on your own skills in that regard tends to isolate you. It's all well and good to be CrossFit-level fit, but one still has to live in a society where sidewalks and bike lanes are being replaced with roads because everybody else chooses an unfit lifestyle that demands that one drive a car. Similarly, Luddism isn't viable unless one wants to attempt to live completely outside a society, even as that society increasingly looks and feels like Idiocracy. The problem is that the perception is mutual: the guy thinks you're an idiot because you aren't navigating by smartphone. You aren't speaking his language (<- choices, way of thinking, lifestyle, product consumption, ... ), which results in this problem: https://www.youtube.com/watch?v=oCIo4MCO-_U
The facts and contextual frameworks acquired by reading and thinking won't translate when interacting with a society where facts are disregarded; where a fun "infographic" is expected to be at hand so that people don't need to study or think (busy busy lifestyles, you know); and where important questions are settled by "style".
The problem here is that living in an Idiocratic world effectively demands that one balance two different frameworks at the same time. First, one has to understand and be able to use the correct framework along with objective facts. However, it is no longer enough, or even encouraged, to just communicate that understanding. Instead, one has to translate it and communicate it in some kind of fun and personable style that "folks" can relate to. Of course, that language is vastly inferior when it comes to dealing with an accurate and precise reality, or with almost any level of complexity (anything that involves combining more than two conceptual ideas). This added complication also imposes another cost on the zero-sum problem of what one should spend one's time on. "Stupid" does exact a toll, in the form of the inefficiency of making inferior or, well, stupid decisions.
Maybe if techmology finds a way to directly read the emotional impact that certain words or styles have on people in an idiocratic world, algorithms could actually handle the low levels of complexity that need to be communicated to the average person in such a world. Not because the algos are getting smarter, but because people's behavior is getting tuned down to the simpler level of complexity that algos are capable of handling. I think we're somewhat headed there (maybe the ultra-democratic feedback system leads to a kind of self-fulfilling prophecy). It's already clear how TV has settled on a few recipes, or scripts, for making series and newscasts that have been proven to be functional. Popular nonfiction books also have a formula: they're edited to a 6th-grade reading level with bigger-than-average fonts (compare with a book from 25 years ago, or with a modern monograph) and contain only a few original thoughts, lest too many readers complain that the text is too small, that nobody uses words like "thus", or that there's too much information in the book. The current election shows how democracy works when brazen disregard for facts becomes the norm.
Facebook is a good example of a first step in influencing the pursuit of a dumbed-down intellect with its introduction of five or six different "reactions". It used to be that people could just "like" or ignore. Now people can react with "happy", "sad", "laugh", and so on. Conspicuously absent is the much-requested "dislike" or "this is stupid" reaction. Effectively, and in a very Big Brotherly, Newspeak-y way, people aren't allowed to talk, or "react", in a way that questions the underlying foundation of the system, which is essentially about generating advertising dollars. But I digress. My point is that one can use these reactions to tie consumer reactions to product words and thus start controlling people's reactions (the filter bubble) algorithmically. IOW, you no longer need somebody with a brain to understand or relate to newsfeed consumers. They've successfully trained an algorithm to serve them just the right concentration of puppy dogs and political memes, which they take in passively or at most "react" to, thus propagating the system.
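To make the mechanism concrete, here is a minimal toy sketch of that kind of reaction-driven feed ranking. Everything in it is hypothetical (the weights, the topic labels, the function names); real feed rankers are vastly more complicated, but the point stands even in this tiny version: no human ever needs to understand the content, only count the reactions.

```python
from collections import defaultdict

# Hypothetical assumption: every typed reaction, even "sad" or "angry",
# counts as engagement, so all weights are positive. The specific
# numbers are invented for illustration.
REACTION_WEIGHTS = {"like": 1.0, "love": 2.0, "haha": 1.5,
                    "wow": 1.2, "sad": 1.1, "angry": 1.3}

def build_affinity(history):
    """Sum reaction weights per topic from a user's reaction history.

    history: list of (topic, reaction) pairs.
    """
    affinity = defaultdict(float)
    for topic, reaction in history:
        affinity[topic] += REACTION_WEIGHTS.get(reaction, 0.0)
    return affinity

def rank_feed(candidates, affinity):
    """Order candidate posts (topic, headline) by topic affinity.

    Topics the user never reacted to score zero and sink to the
    bottom, which is the filter bubble in one line of code.
    """
    return sorted(candidates, key=lambda post: affinity[post[0]],
                  reverse=True)

history = [("puppies", "love"), ("puppies", "haha"),
           ("politics", "angry")]
feed = rank_feed([("economics", "Rate hike explained"),
                  ("puppies", "10 cutest corgis"),
                  ("politics", "Candidate says thing")],
                 build_affinity(history))
print([topic for topic, _ in feed])
# -> ['puppies', 'politics', 'economics']
```

Note the self-reinforcement: every "react" adds weight to a topic, which surfaces more of that topic, which harvests more reactions, with no step anywhere that requires comprehending what the posts say.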