It was only a matter of time.

London-based IntelligentX Brewing Company announced plans in 2016 to leverage artificial intelligence (AI) to brew beer. Presumably better beer, but we will have to wait and see.1 Starting with four base beer styles (Amber, Golden, Pale, and Black [stout?]), targeted consumer preference questions, and Artificial Brewing Intelligence (ABI, read abbey, like the beer-making monks; get it?), the company aims to leverage technology to more quickly and fully match beer batches with people’s tastes. Feedback questions are descriptive, explain what the terms used mean so as to better educate drinkers and more accurately inform the algorithm, and can be answered via smartphone. The combination of mobile, social,2 data, and craft beer is ingenious for having overlapping and energetic demographics. Through direct user input, at or near the time of consumption, the brewery will be better able to cater to customer preferences.
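IntelligentX has not published how its ABI actually works, so purely as an illustration, here is a minimal sketch of the kind of feedback loop described above: numeric ratings collected by phone nudge the choice of the next batch toward the best-liked base style. The rating scale, the epsilon-greedy policy, and all the numbers are assumptions for the sake of the example, not the company’s method.

```python
import random
from collections import defaultdict

# Hypothetical sketch only; not IntelligentX's actual algorithm.
# Each base style is treated as an arm of a simple epsilon-greedy bandit,
# updated from drinker ratings collected via the phone questionnaire.

STYLES = ["Amber", "Golden", "Pale", "Black"]

ratings_sum = defaultdict(float)   # running total of ratings per style
ratings_count = defaultdict(int)   # number of ratings per style

def record_feedback(style: str, rating: float) -> None:
    """Store one drinker's rating (assumed 1-10 scale) for a base style."""
    ratings_sum[style] += rating
    ratings_count[style] += 1

def average_rating(style: str) -> float:
    """Mean rating so far; zero if the style has no feedback yet."""
    if ratings_count[style] == 0:
        return 0.0
    return ratings_sum[style] / ratings_count[style]

def choose_next_batch(epsilon: float = 0.1) -> str:
    """Mostly brew the best-rated style, occasionally explore another."""
    if random.random() < epsilon:
        return random.choice(STYLES)
    return max(STYLES, key=average_rating)

# A handful of made-up questionnaire responses
for style, rating in [("Golden", 8), ("Golden", 7), ("Amber", 5),
                      ("Pale", 6), ("Black", 6)]:
    record_feedback(style, rating)

print(choose_next_batch())  # usually "Golden", given the ratings above
```

The real system presumably juggles far more variables (recipe parameters, ingredients, demographics), but the loop is the same: ask, record, update, brew.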

Data and, more broadly, research are absolutely a benefit to problems of all sorts, consumer products included, especially one that targets our senses so directly. Applying technology, data, and statistics to beer is no new thing; there is a legacy going back at least to the industrial revolution and Student’s t-test at Guinness, extending through to the logistics and marketing efforts of the Big Boys. Getting customer input is a brilliant step toward shortening the feedback loop. Instead of running focus groups and scraping online reviews, you have here a direct channel where you can ask targeted questions that immediately inform future development.
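For a sense of how little machinery that legacy requires today, the test Gosset developed at Guinness can be run in a few lines. The tasting scores below are invented, purely to show the mechanics.

```python
# Illustrative only: made-up tasting scores for two batches of the same recipe.
from scipy import stats

batch_a = [6.8, 7.2, 7.0, 6.9, 7.4, 7.1]
batch_b = [7.5, 7.3, 7.8, 7.6, 7.2, 7.7]

# Two-sample t-test, the descendant of Gosset's ("Student's") work at Guinness:
# could the difference in mean scores plausibly be sampling noise?
t_stat, p_value = stats.ttest_ind(batch_a, batch_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# A small p-value suggests the two batches really do score differently on average.
```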

Craft beer is growing, data is proliferating, mobile is everywhere. The marriage of these things seems inevitable, by accidental market forces bumping into one another (who invented the bicycle, after all?), if not by intention. There is an “it makes sense” angle. The use of data can be a story in itself. Beer is always a fun thing to cover. Together they make a catchy idea and garner a little coverage. That is all for the good. General news coverage of craft beer likely increases the chance of intriguing casual beer drinkers to try something more interesting.

[Figure omitted; data source: Bloomberg]

Running headlong into a technological solution without a clear idea of purpose is likely to get us where Thoreau warned: inventions that are but improved means to an unimproved end. Is the goal the best beer for a sole consumer, a demographic, the mystical average person, or something for a company to hook us into buying? What about exploring and learning on our own?

To better tackle several of these issues, we will leverage Neil Postman’s seven questions of technology.

1. What is the problem to which this technology is a solution?

Casting as wide and generous a net as possible while focusing on the beer-drinking experience, we can come up with several problems that this technology may address. The technology would provide a topic of conversation when one needed something to talk about. It might provide a sense of importance at being asked for input by the brewer. The process of drinking, thinking, and questionnaire clicking engenders product engagement. At a time when one has only a few thousand (million?) distractions at arm’s length, there is space yet for another excuse to bring out our phones and break eye contact with the humans in our vicinity.

The problems for the brewery are more concrete. There is the trouble of putting out several beers and not knowing which will take off, if any. Getting noticed in a busy field is always a concern. Allowing for immediate customer feedback provides a built-in excuse for a less than stellar beer. The very blandness of a beer only reinforces the story of providing a blank canvas to build on together. This togetherness of brewery and drinker ties in nicely with the aforementioned customer engagement and follow-up curiosity that could lead to repeat purchases and/or word-of-mouth marketing. Additionally, any automated component has the potential for cost savings, whether by cutting back on labor, introducing efficiencies, or both. Whatever the upfront expenses for creating and running the algorithm, they would be made up for by no longer needing to hire as many, as sophisticated, or as dedicated (full-time vs. part-time) brewers as before.

The benefits of making customers think more critically about their beer aside, this approach to matching drinker preferences to potential beers only gets in the way of bringing the two together. Drinkers could just pick up another of the 1,000+ beers available and seasonally rotated in their area, as opposed to waiting for the next algo-infused batch. The drinker will always be quicker at iterating the experience than any brewery, at least until we have matter compilers. This technology does not address a problem for the drinker,3 which leads us to ask…

2. Whose problem is it?

Finding the right beer is an ongoing quest, something enjoyable in its own right. There are thousands of breweries with tens of thousands of beer options across 100+ beer styles. The count of each of these three grows yearly, proof of the creativity and explorative nature of the craft beer community, drinker and brewer alike. With so many choices (information glut) the drinker can get lost but, as with anything else, the circuitous nature of discovery is part of the learning process, something that will help develop a foundation for the subject. If a recommendation is desired, one may ask the bartender, the store clerk, or friends. Each of these options raises the opportunity for a conversation that could lend more insight and context than “people who enjoyed this beer also drank these.”

Drinkers might pick up some beer history, information on the brewing process, personal anecdotes, brewery stories, ideas about the establishment, as well as get a better handle on their friends’ preferences and tastes (or lack thereof). Conversation allows for give and take, the best form of engagement and feedback yet invented for our species, in part inventing our species. Additionally, seeking out the answer provides the backbone for knowledge as opposed to having efficient, decontextualized information simply handed over.

The other edge to all the options is the problem of standing out, consistently and persistently. A quality product, good distribution, and exposure are still some of the best ways of accomplishing this. Gimmicks work if backed by reliable characteristics but on their own are not built for sustained repeat purchases. The use of machine learning to create a better beer is a problem for the brewer, not the consumer. To the extent that beer recommendations of a tech nature are desired, drinkers may turn to online resources, apps, and forums for guidance.

3. Suppose we solve this problem, and solve it decisively, what new problems might be created because we have solved the problem?

The first thought, as with so many technologies nowadays, is job displacement. Whatever head brewer and supporting staff are needed can be reduced, perhaps even replaced. Short of removing the humans, there remains the threat of deskilling, retaining only work that is a pastiche of tasks with no logic to its assignment other than being algorithm leftovers. This is not human logic but machine logic.

In the rush and enthusiasm to automate processes we have unearthed a set of unexpected issues that are most generously described as ironic. Automation is implemented for a variety of reasons, many of them valid, some wishful: from quicker, more reliable, consistent, and predictable responses to removing human error, bias, and/or expenses. Crucially, things do go wrong. Depending on the importance of the process, businesses and governments wish to keep a human-in-the-loop as a safety precaution for the unexpected and unlikely. The irony par excellence is that “the more advanced a control system, the more crucial the human operator” (IOA). This would be humorous but for the accompanying problems.

Designers of automated processes unintentionally introduce two sources of new trouble. First, there are the designer errors themselves, which are a “major source of operations problems” (IOA). Second, whatever is left out of the automation is left over for the human elements, to either monitor or take over. Essentially, these leftover responsibilities may be disparate tasks with no rhyme or reason, showing no consideration or respect for the work of the human.

Within the remaining responsibilities of human operators, monitoring and takeover, we may review the cognitive and manual expertise needed. The major challenge in either case is a disassociation, distancing, and deskilling of the operators, transforming a previously expert professional into a non-expert.

In order for an operator to take effective and rapid control of a misbehaving process they must make a step change: one moment they are attentive to something else (they may or may not be the one monitoring the process), and the next they are in the position of taking control. Depending on how infrequently the operator actually uses their skills, we can expect a deterioration in ability (deskilling). Ironically, we have made the prospect of takeover a more complex issue than if an operator were manually in charge at all times. What is required at the time of takeover is someone with more expertise, not less, but the system fosters the latter (IOA).

In the instance of monitoring, it has been shown that humans struggle to stay attentive to infrequently changing inputs for longer than 30 minutes at a time. The automated system requires supervision that a human is ill suited to providing. Besides issues with inattention and anomaly identification, we face much the same trouble as with takeover. An operator’s knowledge is dependent on the frequency of use. It is with this repetition that the operator collects information about behavior, feedback, predictions, and outcomes. An expert does not simply have raw data about a scenario in mind but rather an intricate web of results coming from predictions and decisions made (IOA). Any one moment is but a snapshot and, taken on its own, without context, an arbitrary construction that is inadequate to describe the events that led to the situation.

The removal of human expertise will mean a greater reliance on the tools and fewer in-house brewery authorities who can tell how and why their beer works from a taste perspective or, alternatively, when a batch is off. You would have created a brewery without a brewer. Should things go wrong with the recipe, the preparation, the storage, or countless other things on the way to the customer, the business would not have the in-house intelligence to target and address the problem.

This lack of internal expertise radiates outward when you imagine the implementation of such technology at other firms. By displacing additional staff, perhaps holding on to the head brewer alone or some other qualified (enough) persons to serve as the human-in-the-loop, the industry may ultimately undercut the number of next-generation brewers, as well as their quality.

We might be giving this one technology undue significance, but we can easily imagine (and increasingly do not have to, as “AI” implementations roll out) diverse applications of AI and high-tech “solutions.” Naturally we can say this is part of the point. AI will replace brewers, cut down on positions, and we will need fewer staff per barrel. However, beyond the reduction of people, we are potentially removing our understanding of the process by undercutting the apprenticeship inherent in the job training. Implementing this technological solution could set back and potentially suspend our ability to create new and delicious beer for as long as we rely on it.

4. Which people and what institutions might be most seriously harmed by a technological solution?

More than a hint was given that the employees of the brewery would be harmed, specifically the brewing staff, though the trouble would extend to all who rely on the technology and have to get involved in the cleanup of an algorithm on the fritz. This could mean dealing with bad-tasting and spoiled beer, or losing a job to a “brewer” that has no ideas or insights into what would help make a beer more attractive from a taste and community-connection perspective. The loss of an invested and motivated human would also cut down on the internal resources available to the brewery.

How will we empower intelligent disobedience if we are all just dial readers?!

Breweries might lose further control or power by relying on an algorithm that is not of their devising. We can look to the plight of farmers worldwide at the mercy of agribusinesses who hold patents not only on the pesticides and fertilizers sold but on the seeds themselves; these businesses have managed the chilling task of claiming intellectual property on mother nature. It would hardly be surprising to see the same control exercised by the provider(s) of a techno-brewing service. And lest we believe that breweries might tinker under the hood so as to add some secret sauce, we can look at the struggle of right-to-repair advocates with respect to smartphones, John Deere tractors, and all the rest to get an idea of the challenges and restrictions to be faced when the things we purchase are not actually owned by us (The End of Ownership).

With brewing under intellectual property control, sold as a license to be used (we may speculate, taking inspiration from current software license agreements [SLA] or end user license agreements [EULA]), how might breweries hope to differentiate themselves? Numerous variables go into the completion of a beer, but it is hard not to see these breweries as effectively franchisees, doing all of the hard work while the home office takes a sip out of the profits for each barrel.

Moreover, data collection brings its own serious issues with respect to customer privacy. Shouldn’t our pints come without the surveillance concerns?

5. What changes in language are being enforced by new technologies, what is being gained and what is being lost by such changes?

The meaning of brewer and brewery is ever so slightly altered by accommodating responsibilities to technology. Depending on how successful the algorithmic implementation is, a brewery could come to mean exclusively the machine that makes the beer, or some other colloquial label that obscures this mechanical nature. This may seem fanciful now, but remember that a computer was originally a person who computed math equations in an office pool for government, university, and corporate firms. A total displacement of meaning would make it nearly inconceivable for future generations to associate anything but an algorithm and machinery with what it takes to develop and sustain beer styles, or any of the adjacent industries that could import such a technology.

One of the founders references coffee, chocolate, and, his favorite, perfume (I’m sure it is, sweet boy). The brewery as well would have its properties change. Removing humans and mechanizing further would only further industrialize the process, sterilizing a craft that, at least at the small-batch scale, took pride in the moniker. We rarely look at laboratories as instances of craft, and I doubt we would eventually treat these sorts of breweries any differently. Perfume may in fact be an excellent parallel: here is a product under the control of large firms making a chemical product that is primarily water and marketing it as an aspirational good. Each of these changes removes the idea of the human role and brings further to the fore the process itself, devoid of any human creativity or associated livelihood.

Changes in language resulting from technology are a natural segue into bringing up the Luddites. For much of its history the term “Luddite” has carried a negative connotation and is typically invoked as an insult. Though the term’s origin is a bit of a mystery, the movement to which it points is known, taking place in England from 1811 to 1816 (LR). In our day the term refers to people who are anti-technology. Further, the term indicates people who are backward, somewhat unintelligent, and resistant to natural progress, an inevitability that is silly and even possibly dangerous for us to resist. This same term was originally used during the movement with pride. Clearly the word had other meanings.

One of the first misconceptions about the original Luddites is that they were anti-technology. This is far from the truth. The Luddites targeted and destroyed machines because of what they represented. Machines as stand-ins for other factors should clue the reader in to a second misconception: that the Luddites were unintelligent or unreasonable. This point of view also needs mending. In the early parts of the Industrial Revolution, a time when unionization was illegal, job security non-existent, and wages pitiful to the point that families enlisted their children to work in the factories alongside them, the machine was seen as symbolizing a threat to their already precarious existences (LR).

These skilled and semi-skilled workers were all too aware of what increased technological breakthroughs would do, especially as they accelerated. Note that the Luddites were threatened not only by new machines but also existing machines put to new uses. Would not some sort of resistance be the most rational action?

Postman: “only a fool accepts new technology unquestioningly.”

Language and thought influence one another, and when we abdicate our selection of words to common and vapid phrases we give up more than just word choice; we give up an opportunity to think for ourselves. The latter is a good in itself that should require no further defense. To the extent that we passively take on the vocabulary and worldview of tech proponents, we are equally liable to give up that opportunity. This is dangerous in that no technology is neutral. Technology empowers some and not others, and from those power relations fall out winners and losers. The displacements arising from such shakeouts result in real human pain.

It is important that we do not simply onboard the language spoon-fed to us, swallowing it down without a thought to its taste. Terms and phrases such as “Friends,” “Like,” “Share,” and “Gig Economy” are cuddly euphemisms used by powerful entities with outsized influence that should not be taken for gentle allies. Take just the last phrase as an example. A different, perhaps more honest, way of putting the same service is how it is marketed in India: Get My Peon. To speak clearly requires clear thought, and vice versa. It is not easy, but it is within our control and one of the best ways of empowering ourselves (PEL).

6. What sort of people and institutions acquire special economic and political power because of technological change?

As consumers, it is likely that many will cede authority to the machine. Does the beer taste bad? It must be our undeveloped palate, not the flavor deafness of an incorrectly calibrated algorithm. Far too often the appearance of math and complexity provides cover from scrutiny. Seeing a sophisticated apparatus, we naturally assume that it “knows” better than us, forgetting that the underlying structure, mechanization, and algorithms are codified logic based on human-derived assumptions, choices, and values. Should any aspect of that built apparatus or its software be inappropriately configured, we would be deferring to a faulty tool (WMD).

There is also the concern of only a handful of players having control of the brewing hardware and software (the money isn’t in the copiers but in the ink). They may not acquire household-name recognition, but a look at smartphones, cars, online retailers, search engines, or any industry of the reader’s choice, especially the weightless economy, will show a disproportionate concentration of market share and value going to a few firms.

In the beer industry itself, four to five global conglomerates account for a majority of beer volume and revenue. The tentacles of these “breweries” have reached across the globe. They control aspects of brewing all along the value chain, from hop farms to distribution, and have shown themselves adept and sophisticated at acquiring specialty brands, infiltrating foreign markets, leveraging information technologies, and purchasing beer-related sites (AB InBev and RateBeer). How much more attractive, then, to own and control the system other breweries may become reliant on.

7. What alternative uses (media) might be made of a technology?

The inherent biases, limitations, capabilities, and built-in instructions of how to use this technology are perhaps not all fully clear, at least that is my hope.4 We are not looking to cut off innovation for the sake of being contrarian or ignorant of the past benefits of science, technology, and statistics in brewing (see chemistry, refrigeration, transport, Student’s t-test). Rather, we may have obligations to ourselves and our values that override those to a technology (TN).

There is always the honorable field of education to point to. This tool could be implemented as customer education media. Information provided by the consumer could be met with other beers to try, foods to pair with, or beer history to consider.
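A minimal sketch of that alternative, with entirely hypothetical pairings and placeholder notes: the same questionnaire response could be answered with an educational nudge rather than fed back into a recipe.

```python
# Hypothetical alternative use: answer feedback with education, not recipe tweaks.
SUGGESTIONS = {
    "Amber":  {"try_next": "a Vienna lager", "pair_with": "roast chicken",
               "history": "(a short note on the style's origins would go here)"},
    "Golden": {"try_next": "a Belgian blonde", "pair_with": "a mild cheese",
               "history": "(a short note on the style's origins would go here)"},
}

def educational_reply(style: str, liked_it: bool) -> str:
    """Turn a drinker's answer into a suggestion instead of a mere data point."""
    info = SUGGESTIONS.get(style)
    if info is None:
        return "Tell us more about what you tasted."
    if liked_it:
        return f"Glad you enjoyed it! Next time, try {info['try_next']} or pair it with {info['pair_with']}."
    return f"Fair enough. Some background while you finish the glass: {info['history']}"

print(educational_reply("Amber", liked_it=True))
```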

To avoid the concentration of power and the reduced freedom that would come from the technology being used as intended, the recipes could be placed in the creative commons.

Tool Using, Not Used by Tools

A simple thought: automate what is possible, hand it over to machines/algorithms, and let the humans do the rest. This is sold as a labor-saving approach that benefits employer, employee, and customer alike. Predictably, the greatest benefit to the business has to do with the bottom line, which is practically the only consideration, and we are foolish to believe/think/expect otherwise, regardless of the ethics of behavior following exclusively from such a motivation (let us set Milton5 aside for now).

Automation is meant to be a boon to employees. By offloading repetitive tasks, the worker is freed to focus on higher-level work. This is the concept in general, and the logic holds theoretically, occasionally in practice as well. Far more interesting and frightening are the examples where things do not go as expected. In a search for ever greater “convenience, speed, and efficiency” we are turning our jobs into nothing more than glorified computer nannies (ACL).

Automation is no longer just about the mechanical displacement of physical bodies. Nowadays computers are a part of all our lives and their intrusion can be seen in white-collar work: doctors, bankers, architects, attorneys, and airline pilots, to name just a few. Aside from the conveniences afforded us, the introduction of computers into our work affects “how we act, how we learn, and what we know” (ACL). This added layer provides one more remove between the worker and her work, not infrequently resulting in deskilling: pilots who forget how to fly in emergency situations, doctors who do not diagnose, and architects who cannot draw.

Balancing out the short-term augmentation of our abilities via the introduction of technology and automation is the long-term pernicious effect of deteriorating skills. It used to be that automation was limited to “distinct, well-defined, and repetitive tasks” that lightened our physical load (ACL). Now the same approach applied to cognitive tasks is making our work disjointed and narrowing our focus. Ideally our newfound freedom would allow us to pursue higher-level pursuits that computers cannot do (yet?!).

The breaking up of our work into ever more discrete components that can be pushed onto machines changes not only the tasks but the people: their roles, attitudes, and skills. We become disconnected from our surroundings and ourselves, relegated to just another cog in the process, and not a very good or well-used cog at that. Among the deficiencies that become salient in our new roles are automation complacency, where we are lulled by computers into a false sense of security, and automation bias, “plac[ing] too much faith in the accuracy of the information” provided on computer displays (ACL). Both are manifestations of deferring authority and discounting our autonomy. We are proceeding with a Faustian bargain whereby we hand over our ability to grow in return for convenience, accepting an inherent fragility along the way. Learning is often an inefficient process that requires many passes before it sticks, something recognized as the generation effect.

We often categorize these newly revealed issues as human shortcomings and think up ways to relieve the human-in-the-loop. The idea of total automation comes to mind: if some automation is the problem, no doubt because it is incomplete, then more of the same would bring us to a solution. However, “no machine is infallible,” and as systems become increasingly complex the potential areas of failure increase exponentially. Other solutions look to alter the role of automation so that it hands back control at “frequent but irregular intervals,” thereby limiting the scope of machines and relieving humans from the role of observer only (something we do poorly due to limited attention spans and other observational shortcomings), and to add in educational components that constantly test workers and help them retain their skills.

We must stop categorizing these as human shortcomings and instead see them as failed technical approaches. There are two reasons for this. The first is heady and related to human freedom, dignity, rights, and wages, to name just a few lighthearted topics you can joke about with your crazy uncle over Thanksgiving dinner at no risk of ruining the evening. The other is more practical: machines may be cobbled together in various ways to suit our needs; indeed, this is one of their great strengths. It would be the pinnacle of idiocy to devalue the strengths of the most powerful learning system we have, our brains, for the sake of accommodating them to the needs of computers. Far wiser to maximize the capabilities of the human mind and to build enhancing technologies around this. Otherwise we risk having stupid humans running stupid machines.


Notes

1 And better how is an important qualification to keep in mind.
2 The use of Facebook Messenger is clearly seen in one video.
3 “Not solutions to any problem a normal person would regard as significant.” (Postman lecture)
4 Not clear which would be more disappointing, that the uses proposed are all there is to this algorithm or that we cannot see alternatives.
5 “The business of business is business.”

ACL: Carr, N. (2013) All Can Be Lost. The Atlantic.

IOA: Bainbridge, L. (1983) Ironies of Automation.

LR: Linton, D. (1985) Luddism Reconsidered.

PEL: Orwell, G. (1946) Politics and the English Language.

TN: Postman, N. (1992) Technopoly.

WMD: O’Neil, C. (2016) Weapons of Math Destruction.