Reading Kevin Kelly’s book (The Inevitable) motivated me to rewatch the 2002 movie Minority Report. The movie, directed by
Steven Spielberg, is based on a short story by Philip K. Dick. The year is
2054; an experimental unit named Pre-Crime has been operating in Washington
D.C. for six years and has reduced murders to practically zero. How did they do
it? They have precogs, individuals who have extrasensory perception (ESP)
skills allowing them to see murders before they inevitably take place. The
precogs are a “hive mind” linked to computers and data, and as one character
says in the movie: “Don’t think of them as human.” They rest floating in a pool
with as little to disturb them as possible. They are fed nutrients and their
cerebral activities are linked to Pre-Crime’s sophisticated computer system.
For a 2002 movie envisioning the future, Spielberg did a
fantastic job. The featured tech includes a combination of things I’ve written
about recently: virtual reality and augmented reality, connected to and drawing
from a huge database (the Internet!). Big data is combined with the visual
streams from the precogs, allowing the main protagonist, played by Tom Cruise, to
conduct a virtuoso investigation. As he waves his hands in the air, different
streams of information are constructed and displayed for the team. It’s like
watching a maestro conduct an orchestra. (The Iron Man movies borrowed heavily from this imagery.) While we are
still a long way off from 2054, we are well on our way to developing that sort
of tech.
The precogs provide one stream of data into a huge computing database that allows pinpointing the day, time, location, and actions of a future event. One way to think about this is that with enough data, someone
observing past behavior could make reasonably good inferences as to future
behavior. If you didn’t do well on a pre-test, skipped class, didn’t take notes, didn’t do the homework, and didn’t study, then you’re unlikely to do well on the exam. Humans connected to the internet, whether via wearable devices, mobile phones, or computer terminals, are constantly giving up more and more of their data to this database. A sufficiently strong A.I. just might be able to piece together some aggregate future behavior by comparing past behavior against a huge database drawn from hundreds of millions of individuals. This A.I. may even be able to assign a probability value to its predictions based on pre-calculated statistics.
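To make the speculation a little more concrete: here is a minimal sketch, in Python, of how a system might turn logged study behavior into a probability of passing an exam. Every name and number in it (the features, the weights, the bias) is something I made up for illustration; a real system would fit those values from its enormous training set.

    import math

    # Hypothetical weights such a system might have fit from historical data.
    # Feature names, weights, and the bias are all illustrative, not real.
    WEIGHTS = {
        "pretest_score":      2.5,   # fraction correct on the pre-test (0 to 1)
        "attendance":         1.2,   # fraction of classes attended (0 to 1)
        "homework_completed": 1.8,   # fraction of homework completed (0 to 1)
        "hours_studied":      0.15,  # logged hours of study
    }
    BIAS = -4.0  # baseline log-odds of passing with no positive signals

    def probability_of_passing(student):
        """Logistic model: past behavior in, probability of passing the exam out."""
        log_odds = BIAS + sum(w * student.get(name, 0.0) for name, w in WEIGHTS.items())
        return 1.0 / (1.0 + math.exp(-log_odds))

    # A student who skipped class, did little homework, and barely studied.
    student = {"pretest_score": 0.3, "attendance": 0.4,
               "homework_completed": 0.1, "hours_studied": 2}
    print(f"P(pass) = {probability_of_passing(student):.2f}")  # prints a low probability

Fit the weights on data from millions of students instead of my guesses, and you have the beginnings of an academic precog.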
This past week I received yet another spam e-mail from an ed-tech company. I usually delete these without reading them, but since I had been
pondering the precogs, I decided to read the e-mail. This one was from Pearson
and was advertising its adaptive learning platform, Knewton, integrated into
Mastering Chemistry. I have some idea of how these systems work (I’ve mentioned
Knewton in a previous post); I read some of the initial literature surrounding
ALEKS before it became proprietary. Earlier this year I also read a white paper
(can’t find it now) taking stock of where we are with adaptive learning systems
in higher education.
I watched the short embedded video advertising Pearson’s product.
Not surprisingly, I didn’t learn anything new from a technology standpoint.
What was more interesting, however, was the sales pitch. The A.I. system gauges
your starting point by asking you questions (you solve chemistry problems) and, based on your mastery or lack thereof, shunts you along particular pathways personalized for you. These pathways have been personalized based on an ever-growing training set – the big data and the analytics. Every choice you
make, key you press, even the time you take working on a problem, all that is
added to the database. If there were an eye-tracking system, I bet they’d be
gobbling up that information too. In Minority
Report, that’s how advertising works. As Tom Cruise wanders through the
subways and the shopping mall, ubiquitous eye trackers scan him, personalize
his ads, and make suggestions.
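I don’t know what Knewton actually does under the hood, but the general shape of many adaptive systems can be sketched in a few lines: keep a running mastery estimate, update it after every answer, and pick the next problem near the student’s current level. The update rule, the tiny item bank, and all the numbers below are my own illustrative guesses, not Pearson’s algorithm.

    # A toy adaptive-practice loop: keep a running mastery estimate,
    # update it after each answer, and pick the next item near the student's level.

    ITEM_BANK = [  # (item id, difficulty on a 0-to-1 scale)
        ("limiting_reagent_easy", 0.2),
        ("stoichiometry_medium", 0.5),
        ("equilibrium_hard", 0.8),
    ]

    def update_mastery(mastery, correct, rate=0.3):
        """Nudge the mastery estimate toward 1 on a correct answer, toward 0 otherwise."""
        target = 1.0 if correct else 0.0
        return mastery + rate * (target - mastery)

    def next_item(mastery):
        """Choose the item whose difficulty is closest to the current estimate."""
        return min(ITEM_BANK, key=lambda item: abs(item[1] - mastery))

    mastery = 0.5  # start in the middle before any evidence
    for answered_correctly in [True, True, False]:
        mastery = update_mastery(mastery, answered_correctly)
        print(f"mastery ~ {mastery:.2f}, next item: {next_item(mastery)[0]}")

The real systems are fancier, of course, but the loop is the same: every answer feeds the estimate, and the estimate feeds the next question.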
It’s not a big stretch to “classify” students (based on
their performance in these adaptive learning systems) into categories of
academic competence. Pearson and other companies are already moving into other parts of student tracking and advising (this is a major push), selling their products to the Student Affairs side of the university. They are probably
going to make inroads into Career Services as part of their encompassing
strategy (maybe they already have!). Soon the system will be helping students
sort through potential careers based on academic potential, extracurricular activities, and any other information it will happily gobble up into its big data analytics algorithms. With more data, it can assign probabilities that a
certain individual will end up in a certain career. (It can probably already
assign an exam score probability.) Or make a probabilistic prediction that an
individual will take certain actions. Starting to sound like a precog yet? A
hive mind, perhaps?
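Mechanically, that last step isn’t mysterious either. Here is a toy sketch of scoring a student profile against a few career “categories” and normalizing the scores into probabilities with a softmax. The categories, features, and weights are entirely invented; the point is only how easily a probability gets attached to a prediction about a person.

    import math

    # Entirely hypothetical: score a profile against a few career "categories"
    # and convert the scores into probabilities with a softmax.
    CAREER_WEIGHTS = {
        "lab_scientist":  {"chem_mastery": 3.0, "math_mastery": 1.0, "clubs_joined": 0.2},
        "data_analyst":   {"chem_mastery": 0.5, "math_mastery": 3.0, "clubs_joined": 0.2},
        "science_writer": {"chem_mastery": 1.0, "math_mastery": 0.5, "clubs_joined": 1.5},
    }

    def career_probabilities(profile):
        scores = {career: sum(w * profile.get(f, 0.0) for f, w in weights.items())
                  for career, weights in CAREER_WEIGHTS.items()}
        # Softmax: exponentiate and normalize so the probabilities sum to 1.
        exps = {c: math.exp(s) for c, s in scores.items()}
        total = sum(exps.values())
        return {c: e / total for c, e in exps.items()}

    student = {"chem_mastery": 0.9, "math_mastery": 0.6, "clubs_joined": 2}
    for career, p in career_probabilities(student).items():
        print(f"{career}: {p:.2f}")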
In her third year at Hogwarts, Hermione elects to take
Arithmancy – mathematics that predicts the future. How about algorithms that
predict the future based on big data? I hereby invent the word Algorithmancy.
(You heard it here first!) I’ve speculated that Advanced Arithmancy is akin to
theoretical Physical Chemistry. Algorithmancy would be what I do as a
computational chemist – I apply theoretical models from physical chemistry to make predictions about how ensembles of molecules might evolve, taking into account not only their intrinsic “chemistry” but also the “environment” they are in. I happen to
study the origin of life using computers, but might I also be originating life
in a computer? We call it artificial intelligence. But is “artificial” the
right word? Maybe it’s just a different kind of intelligence, one that we don’t understand and that seems alien to us even as it grows exponentially, feeding on vast
networks of data. Alien intelligence may be the new A.I.