everything wrong with free software
"obedience breeds foolishness"
*originally posted:* jan 2022
sometimes, im pedantic. thats why in the initial effort to tone this review down from a hair-clenching squee, im going to say that im not 100% sure "digital vegan" is the single most amazing book ive ever read. there are others that deserve consideration.
ive read at least one or two books by lawrence lessig, i even tried to get through code 2.0 but found it too dry and depressing. im an angry writer at times, and depressing isnt ALWAYS a bad quality in a book, but at least the license was good. im told code 2.0 is easier if you get it from librevox. [NOTE: maybe not actually librevox, but some kind of audio thing].
the most amazing thing i ever saw from lessig was his talk at dartmouth called "rebooting democracy". what made it special was the way it flipped the world upside down, and changed something forever.
ive had similar experiences with books by daniel quinn, and benjamin hoffs "the tao of pooh". if lessig had written another book that i desperately wanted, "digital vegan" could be just that sort of book.
part of what made these books special was that they were answers to some question, and it was the question (at the time) that made them relevant. one of the best things that passion can do is pound against a wall in an effort to bring it down-- some walls really should be brought down, and without the will it simply wont happen.
then there is the famous "work smarter" approach; ive long been sympathetic to it, because a real strategy (not just will alone) is often a key difference between victory and defeat. in life also, the ability to keep going and not let one defeat be the end of it requires an ability to regroup, reexamine and reevaluate the plan. sometimes, things are even going "well enough" but you stop along the side of a road and find an incredible opportunity that you hadnt noticed before. sometimes we need to grab those opportunities.
far from being idle, good philosophy (as well as a deeper summary of battles and defeats so far) can be priceless as you find yourself reevaluating whatever "fight" drives you to action. "digital vegan" is a "work smarter" kind of book. for people asking "where do we go now?" with regards to free software, it is absolutely a must-read.
dr. farnell clearly did not set out to write a "perfect" book, and if he had he probably would have thrown it away and written another instead. like all good science, "digital vegan" is as much a starting point that raises new questions and challenges as it is a book of wisdom providing answers and putting old quarrels to rest. a "perfect" book could never do this.
although i am sincerely enamoured with this book, i will challenge parts of it in this review, but overwhelmingly i must say that this is a book that will probably have influence on my thoughts, writing and indeed actions for years to come.
this is a topic that NEEDS more voices, and even the experts are not sitting on a mountaintop in this conversation-- they must come down among us if they want to be part of the discussion. the author even states that:
> Bruce Schneier keeps an impressive looking list of 'technologists in the public interest' on his site. I wish it gave me more hope. But they look like a typical slice of white American academia, all of whom could be bought with loose change in Google's pocket.
its important to mention that i do NOT disagree with farnells assessment here; he has an important point, and this is a vital critique of anyone who thinks change will come from simply throwing money at organisations like the fsf, which i believe are compromised and will remain compromised by corporate influence. in their place, i propose organisations that place the rejection of corporate influence (im not saying this is easily accomplished, only that it is possible) at the centre of their values.
did you notice the use of the phrase "white American academia?" this too is no falsehood, the diversity in academia is certainly lacking. but farnells association with the frankfurt school [NOTE: disputed? either way, i guess...] is not a secret i think, and it puts qualifiers on any over-arching strategy that might come from all this.
its not my desire to take this topic completely off course without returning to the core subject, but i believe a detour is necessary (if we are to achieve the sort of "balance" and autonomy that the book calls for). the frankfurt school, throughout history, has sidetracked and reformatted questions about classism and turned them into open markets of intersectionality. this is not entirely a bad thing; inasmuch as social justice was about class-directed abuse from the very beginning, the frankfurt school has shed light on details of the past and present that the overall narrative of real societal progress has neglected to address or mitigate in a reasonable timeframe. the timeframe (and patience) is of great importance, but so are merciful exceptions.
i do not normally even grant them this, but there are moments when this faction has increased the resolution (the vital level of detail) on matters that were at risk of being oversimplified.
and like a halon fire suppression system it may be necessary, but halon also makes the air in a room unbreathable. what invariably happens to the frankfurt school is focus on ever narrower details (you would think this could be avoided, when generally in practice it can only be mitigated after the fact) until the actual point of the discussion is lost-- it leads to things like feminists silencing women who arent "feminist enough", as the voice of a woman no longer matters unless it toes a party line.
this is exactly the sort of stifling-for-a-good-cause that created the soviet union, and being too focused on ever-narrower details without a grounding commitment to the "bigger picture" will end any revolution in progress sooner or later.
so how do you create balance between the infinitely detailed, singularity-like focus of the frankfurt school and the birds eye view of a "less radical" revolutionary? by staying grounded in debates that have more than one side as a feature, by not letting go of the "bigger picture" and by remaining tolerant (but not necessarily lukewarm) about opposing views.
it needs to be said that farnell seems well aware of the need for that sort of balance. it may be impossible for him to convey that need to the people most likely to carry a torch for the cause, even as he appears to make efforts to do so. for those of us who are wary, we can only watch it unfold and remember the value of the big picture. to use the authors metaphor of food, an ever-unfolding focus on detail can nourish us until it begins to consume us.
the real problem is that in the long run, the ultra-left finds too little support from the largest (and least elite, but sometimes admittedly-- though of course, not entirely-- white) social classes that progress might lend a hand to; and in order to sustain themselves the radicals side with the corporations and landowners.
in the beginning, both sides of the debate (the big picture and the little picture) are sincere in their desire to lift up society by and for the masses, and yet while intersectionality ought to be the mesh that runs throughout all of society (inclusively) it becomes a vast array of pigeonholes to disqualify everybody one by one. an ever-narrower, ultra-left "reverse privilege" becomes a cause of denying the needs of everyone, including the poor, for the sake of serving the needs of an ever-smaller few.
the big picture is not the entire picture-- the finer details can enrich it, but if we allow the frankfurt school to take over for us it is like viewing revolution through a tube of kitchen roll.
every time this happens, the ultra-left is recaptured (voluntarily!) by the landowners and wealthy-- the love of big tech by superficial progressives is a prime and relevant example, indeed the only example that ever led me to care about this overarching political drama.
but the only cure i know is to give the frankfurt school its due, and stay prepared to run like hell when they inevitably turn on us.
so you may think im joking (not in the slightest) when i say digital veganism has serious potential. its just that we must be very careful.
given his [NOTE: disputed?] sympathies, farnell (someone i absolutely admire and look up to) pairs them with a self-awareness [NOTE: not disputed! heh] that is both unusual and staggering:
> Those tagged as `frightened and helpless' (victims) can always be sold protection (insurance, CCTV cameras), while risk takers (alphas with disposable incomes) will buy tokens of freedom and agency (SUVs, jet skis). This system feeds on intersectional identity. It operates in a category theoretical way, dividing and dividing again until there's a place for everyone, and everyone has their place. Much like in Dickens' London.
passages like that one make me wonder if i really need to add any warnings at all.
when comparing digital vegan activism to taoism, its important to note that taoism is not only a path to inner peace, it is also an understanding and recognition of the chaos in the world.
"the tao of pooh" is not the only book ive read on the subject of course (and "the tao that can be written... etc. etc. etc.") but it tells a story of the wise philosopher who is invited to live in the palace, but notes that a turtle held prisoner there (or perhaps even kept there in its death, i was never certain) would likely prefer to be in the mud-- as would he.
the parallel is hard to ignore, as the point was not i believe that mud is the best place one should hope to find themselves in, but that freedom in the mud is better than imprisonment in a palace.
the mediated world big tech built for us is a palace that "allows" us to serve self-appointed digital royalty, but the vegan seeks the freedom awaiting us outside. note that as true as this is, it is of very little consolation to those who did not choose to live in the mud. somewhere between the swamp and the palace, most of us seek a freedom that also doesnt suck.
> It's overdue to restore balance. I don't desire a world entirely without smartphones and social media, I just don't want them to be unabashed tools of tyranny. The death of reason and democracy are not a price I am prepared to pay for convenience or the profits of a minority. Retaining the personal choice to /not/ have them without being made a pariah is non-negotiable.
i also dont want to give the impression that farnell is a stranger to strategy, and he is clearly aware that a "war on digital drugs" would fail, warning progressives against forcing detachment or trying to slay the beast without first extinguishing its fire:
> Forced breaking up of Big Tech will only add more heads to the Hydra. Voluntary disengagement is therefore the only solution.
"greater choice between masters" is thus not a real solution. on that note, while i recognise that "alternatives to social media" may lack some of its venom and algorithmic control, i think the author has failed to notice that in practice the fediverse acts as the faith healers and snake handlers to the church of big tech. it is never a real concern of mine that someone "has religion", only when it is (so frequently) used to exploit and control people.
while twitter and facebook use algorithms and cause deadly international havoc, the fediverse creates several smaller cults of personality, joined together in common (sometimes ultra-left, occasionally alt-right) causes that have yet to be examined in a way that might shed light on the true and systemic failures of this microcosmic network.
farnell himself proposes we consider "protocols over platforms" (to be fair, he is quoting another author) but this adage will only take us part-way. the fediverse is a platform exploiting and posing as a protocol, and some protocols are (even deliberately, since the halloween documents if not sooner) only really suited to one or two platforms themselves. even pretending this is a debate only about technology (something farnell is able to avoid himself as a rule, but none of us are perfect) is a way of missing the point, because the fediverse is JUST another story about how people are abused via an abuse of technology, only now the groups of people "just like us" are doing it, on their own hardware. i link this abuse conceptually to an unrelated statement from the book:
> Don't imagine that universities attract and concentrate any fewer psychopaths than corporations do. Pursuit of shallow personal advancement and profit plagues both camps. The greatest enemy of good scientists today is therefore not the anti-vac crowd, uneducated mobs, religious people, anti-abortionists or climate protesters, it is /other/ scientists who have no moral compass, and bring our whole endeavour into disrepute.
when i warn the next wave of revolutionaries to stay mindful of the big picture, i dont think im touching on anything different than what is said in the above passage-- farnell himself eschews an us-vs-them mindset and places importance on speaking outside ones own in-group. it is the widespread aversion to this very thing that makes the fediverse just like twitter (a collective emulation of its worst abuses, and hardly immune to surveillance or infiltration by anyone with an account that can access one or more relevant instances) and nothing more than a "smaller hell", albeit one that runs on source code we have access to and freedom to change.
the abuse and neglect of free software by the movement itself is not something i expected "digital vegan" to touch on, and the closest it comes to doing so is saying (quite reasonably) that we arent done yet and that this will not be (already is not) a homogeneous movement. that is acceptable, but for me personally, anyone pointing to free software as a solution needs to integrate this into their plans, whether now or later. free software will ONLY come to emulate the non-free otherwise, not unlike a victim that grows up to later abuse their children.
some quasi-free software or cultural licenses (proposed or adopted, but not by free software) have stated that you MUST change the original if you wish to distribute copies. this is a non-free clause and self-defeating, i would never accept something with such a license. but regardless of what the license allows, for users to be free it is NOT enough to have the theoretical right to change the software while bad actors twist its purpose and turn it against the user: we must ACTUALLY CHANGE the software so it doesnt do things in bad faith, or eventually walk away from (replace) the software that acts against us.
here is a lesson that digital veganism can teach, while free software has stubbornly neglected the obvious: it doesnt mean a damned thing if "its vegan" if it kills you anyway.
since 2011, richard stallman (and arguing with stallman about free software is like arguing with einstein about relativity-- you have the right, but youre up against the person who knows more about it than anyone else; hopefully though, one day a free software stephen hawking will come along) has promised to show a soft hand to anyone creating free software in bad faith. i certainly dont believe it was intentional, i believe it was an oversight. what has happened to free software since that time is pure (nearly unmitigated) chaos. anti-features are at least recognised by a few within the movement, and almost nothing is done (let alone officially) about this, which i consider nothing less than selling out the public at the behest of corporations.
none of which is farnells fault, and i think we can forgive him if he considered it outside the scope of the book-- or did not consider it at all. this sort of thing is creeping into the awareness of the free software movement at a miserable and tedious pace, and i fear that by the time the "100th monkey" is reached it may not even be called "free software" anymore. though "free computing" is still worlds better (and more honest) than "open source".
scientists never write perfect books because science is not in the perfection business, but we should give this book its due. it asks plenty of questions that matter, and you will think it self-righteous if you read only the following passage:
> Ask yourself, why can't you stop looking at that phone? Why are you probably, right now, sending data to corporations whose open aim is total cybernetic domination of the Earth's population? Why don't you stop supporting child labour? Why do you contribute to mountains of e-waste, flooding the Earth with toxic compounds that cause sterility in thousands of species? Why do you buy products designed to be obsolete within a few months? Why do you defer judgement on deeply human affairs to `artificial intelligence' which has much less sense than a nematode worm?
you wouldnt ever imagine from those (nonetheless important) questions that this book is incredibly humble, human and above all, honest:
> The subject of this book is that of /Retaking Technology/,
perhaps it would be better still to say that the book is incredibly humble, honest and above all, human. to be certain, these are goals that benefit as much from our humanity (which is common) as they do from our honesty (which at least has its moments).
for a book that may actually help you worry less (and fight more, though maybe now with a more calming, reassuring and thoughtful self-confidence) it contains plenty of warnings:
> In 2021 there is no such thing as a secure smartphone, in any way shape or form. The same is true of IoT devices, your smart TV, your car, your kids' XBox and school computers[...] We need to come right out and say that cyber-security is a failed project.
> The more computers there are on a network the less secure it is.
i dont believe that "group therapy" is mentioned anywhere in the entire text, but the merits of such therapy (witnessed routinely in 12-step programmes) are clear: even in a great struggle, there is incredible reassurance in knowing you are not alone. that may be one of the greatest (but not the only) gifts this book has to offer the coming digital generation.
in practice, warnings really only wake most people up for a moment, or for part of the duration of a real crisis. if the crisis is long enough (days, months, even years or more) people do go back to sleep.
the transition from warning people to educating them is a vital step on the road to progress, and this is a book that will do far more than warn people. a tech press treadmill, by comparison, will simply stir the public to let them fall back asleep-- again, and again, and again. real education is both honest, and essential.
> Buy a simpler, more reliable product.
build a simpler, more reliable solution. this is both a warning to free software (or it should be) and a "teachable moment."
farnells solution is thankfully not one-size-fits-all, but a growing (and collaborative, if unofficial) curriculum of not only awareness and services, but learned self-defence. being passive will not work, waiting to be rescued may well prove foolish, we must set out both together and "on our own" to create a better future. this is not the nature of one-size-fits-all anything:
> Educational technology at the primary and grade school level is overrated. They will pick it up at an appropriate age. And so to the second important step. Even better yet, help them learn for themselves.
> Teach your children to hack. Teach them to be brutally critical. Teach them their digital roots, and how to get root. And if you cannot, at least step aside and give them the encouragement to be curious and break boundaries for themselves.
> Help them to be the digital Rosa Parks of their generation, who will challenge domination, and overthrow it.
i can certainly get behind that. but going further, homogeneous solutions should be met with great scepticism:
> Of course, my argument here is that diversity of technology is vital to creative innovation and societal resilience. Without a diverse reserve, eventually some catastrophe will destroy a system.
and there is hope, if we willfully stand against the tide of technofascism:
> Totalitarian systems that refuse to tolerate any marginal diversity have no long term future.
i have stated before that technology is used to disguise medieval-church-like control as something helpful or even needful. im thrilled that farnell says the same (i am often happy to see experts speaking for a cause i think is important).
> For example, `Algorithmic policing' extracts, freezes and then amplifies patterns of racism in order to ``better serve ethnic minority neighbourhoods''. Calling the racist an `AI', makes it no less disgraceful. Indeed, it's worse. Having swept these racist values under the rug, as it were, we can now pretend it's a technological problem.
as techrights has noted countless times (credit where credit is due) the hypocrisy of big tech is overwhelming:
> Leading the list of countries supplying oppressive technologies are Britain, the USA, Canada, Australia and Israel. The main customers are countries like Afghanistan, Bahrain, India, Kuwait, Pakistan, Qatar, Somalia, Sudan, UAE, and Yemen, who imprison homosexuals, women's rights campaigners and democracy advocates.
> Our own governments use the tired but ever effective ruse of invoking crime, national security and terrorism as justifications, despite no strong evidence of any substantive links [Fung14] between privacy-respecting technologies and terror.
other lessons youll find in this book include the fact that digital rights are human rights:
> Outright discrimination against computing choices is on the rise, and it is something that needs robust challenge. Workplaces discriminating against those who cannot or will not adopt an employer's preferred technologies are an emerging threat.
> Far worse, companies have tried to get their employees to install bogus encryption certificates on their personal devices, so those companies can snoop on their staff.
fighting back is worthwhile:
> In most of the cases I have researched these attempts led to swift legal action, sabotage, loss of reputation, and ultimately to the organisation backing down and apologising.
commercialism is (or wishes to be) everyones boss:
> You are told when you will buy it. You are told when to dispose of it. And during the time you carry around this always-on surveillance and tracking device, it is never really under your control.
the psychological effects are real:
> Aspiring tyrants enamoured with cybernetic governance make an unholy alliance with peddlers of social media and smartphones. For them, a society built on pushing the buttons of narcissism, envy and shame is the goal. Tens of thousands of teen suicides and widespread societal depression, is just the price we pay for ``total information awareness'', and ``nudging control''.
> A dealer needs to hook only one or two customers, while a drug cartel sets out to ensnare a whole nation. Giant technology companies are just such cartels.
of course i have focused on the parts of the book that appeal most or create the strongest desire for a response from me personally.
whats at stake is nothing less than freedom-- note this is coming from a computer scientist and teacher:
> It is not hard to imagine a tactical alliance between Big Tech and governments to suppress particular modes of free speech where it leads to too much dynamism.
farnell says that we must do nothing less than "assert our moral right to be free", that "A good start is to withdraw your moral consent."
digital veganism rejects the status quo:
> We are only anti-progressive if progress /must/ be towards tyranny and domination. To imagine a different kind of technology, and kinder ways of using it, is simply rejecting the status quo.
> And I would counter that, by that logic, it is Google, Microsoft and Apple that are `anti-progressive', because their conceit of progress is so short-sighted.
it does not reject society itself, it is not even a rejection of technology itself. instead, it carefully (and unassumingly) weighs costs and benefits:
> In a naive model of progress, benefits always supplant downsides.
the book contains an incredible description of mainstream "ai":
> What the press call `AI', what we computer scientists call `machine learning' is really advanced statistics. We attach weights (distances) to vectors of information.
> Imagine a complex mobile made of pendulums and windmills that swings and rotates in intricate ways when pushed. Then we make filters that reinforce or diminish those weights until a predictive model emerges.
> When you push on it in a certain way (stimulus) it moves in a certain way (prediction). That's fine for getting a machine to write essays or music. But when applied to people it is unethical.
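the "weights attached to vectors" description quoted above can be sketched in a few lines of python. to be clear, this is my own toy illustration of the general idea (a weighted sum, with weights reinforced or diminished until a predictive model emerges), not code from the book, and every name and number in it is made up:

```python
# toy sketch of "advanced statistics": weights attached to vectors,
# filtered (reinforced or diminished) until a predictive model emerges.
# all values here are invented for illustration.

def predict(weights, vector):
    # weighted sum: push on the mobile (stimulus), read how it moves
    return sum(w * x for w, x in zip(weights, vector))

def nudge(weights, vector, target, rate=0.1):
    # reinforce or diminish each weight in proportion to the error
    error = target - predict(weights, vector)
    return [w + rate * error * x for w, x in zip(weights, vector)]

# "training": repeatedly filter the weights toward the desired outputs
samples = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0)]
weights = [0.0, 0.0]
for _ in range(100):
    for vector, target in samples:
        weights = nudge(weights, vector, target)

print([round(w, 2) for w in weights])  # → [2.0, -1.0]
```

push on the trained model with a new vector and it "moves" in a predictable way-- which is precisely why the book argues this is fine for machines writing essays or music, and unethical when applied to people.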
regarding the security state:
> The problem is that by making things more `secure', obsessing over `certainty' and `identity', we are making societies that are not worth living in for human beings.
> We have a technological auto-immune disease.
> Change happens on a personal level, but also by expressed demands. You must change yourself /and/ the world.
> Demands for digital privacy, dignity, transparency, freedom, protection of children, defence of basic rights and the pursuit of justice under the rule of law are things that nobody should feel bad about insisting upon. You should not be browbeaten into believing they are unobtainable just because computers came along.
on who to rely on:
> There can be no appeal to authority here; you must do it yourself. Not because `authority' is to be disrespected out of hand, but because it is hopelessly incompetent and in disarray itself. Nobody knows how to control Facebook and Google.
on where to fight:
> Social media is a problem and it cannot be beaten by going there to argue against it. Use the liberty of movement while you can. Take the battle elsewhere. To /your/ ground. Ranting on Facebook or Twitter monetises your discontent.
and on the power of striking workers:
> If ``Techxit'', a morally principled `walkout' of programmers and system administrators were to occur, the Big Tech sector would be brought to its knees and corrected in short order.
simplicity creates great potential:
> When I was a ten year old we were able to build and program computers because they were relatively simple in 1980.
while complexity is a limitation:
> Today's computer systems are immeasurably more complex.
who is to blame:
> We made a generation stupid to benefit ourselves.
> For twenty years, while we taught Microsoft Excel instead of programming, education that could make students masters of their own technology was squandered. We lost two generations of innovators.
if we learn from these lessons, if we share and teach each other, we may be able to save technology (and ourselves).
"digital vegan" does not mince words about the environmental devastation caused by e-waste and technological abuse, which perpetuate themselves by contributing to depression, apathy and learned helplessness.
to be certain, fixing humanitys relationship with technology is not the only essential battle ahead of us, it is only one of the essential battles.
nor is it always best to think of it as a battle, but sometimes as a healthier lifestyle. (neither the author nor i seem to put much faith in health-tracking surveillance trinkets-- even some of the most hopelessly giafam-yoked people i know seem to have rejected these as an annoying, tedious and intrusive fad).
overall, the feelings i finished this book with, more than any others, were excitement and reassurance. not since lessig or doctorow have i been so eager to read a second book from the same author.
over a year ago, i said "we need more stallmans". here you go, this is one of the most promising candidates so far: a computer scientist, an activist, a sceptic, maybe even an iconoclast.
instead of a legion of parrots, lets hope the next generation can learn from heroes and leaders (and thinkers) like stallman and farnell, and choose to create the movement for their own freedom based DIRECTLY (but not "verbatim only") on everything we have been taught so far.
i dont have a problem with the four freedoms being carved on stone tablets (even if they were originally three). what i want-- and what i believe farnell calls for-- is a LIVING lesson, a life lesson, in which we forge our defences together, based on all our ideas.
the greatest gift of history is a better understanding of the present. as we celebrate our greatest heroes and strive to be more like them, we can carry these lessons through from the present to a future of our making-- and walk further away from the digital purgatory crafted by the big tech companies, who have done more than enough for us to relegate them to history.
drm-free, with a gratis preview: