There is no question that humans like labels – Balmain, Etro, and Prada fall into one category; Democrat and Republican fall into another; and then there’s Millennial, Gen X, and the dreaded Boomer. (Yes, these are mostly American ones, but the rest of the world has its own flavors . . .) We are happiest when we have a clear sense of who we are dealing with.
When it comes to Art and Design, however, it’s not the “who” that’s important – it’s the “what” and the “why” that matter.
I’ve gone into the discussion of these two before, and am reinvigorating the topic after listening to a conversation between Glenn Adamson and Carrie Rebora Barratt on a different topic: Craft. Different, but no less integral.
Glenn was speaking about his book Craft: An American History with Carrie at Longhouse, where she has recently been installed as Director. Longhouse being the home and “grounds” (if the space surrounding one’s home contains world-class sculpture, plantings, and landscaping, they get to be called “grounds”) of the . . . maker . . . Jack Lenor Larsen.
The very lively conversation covered lots of (ahem) ground – but the thing that struck me most was the amount of time dedicated to this obsession with the titling of what is, effectively, “talent”. There were “artist versus designer” elements, “craftsperson versus artist”, “maker against all of the above”, and a couple of others. In my opinion, these are unhelpful monikers, and they distract from the work that those who bear them produce.
It was brought up by a member of the audience that Larsen referred to himself as a “maker”, though the topic was broached more from the marketing angle of Making (as in Maker Faires, MakerBots, and the overall Maker Movement). This fits with Larsen’s place in the whole generational title obsession: Textile Designer would have been his generation’s name for it, but that is far too limiting. His very forward-thinking (at the time) use of the broader term of “maker” fits his overall persona.
Today we – and I feel a very active part of this royal “we”, though it is not “my” generation that is necessarily practicing it – would probably put Larsen in with the Material Designers: people who are designing the building blocks for other Makers/Artists/Designers. Larsen’s work appeared in architecture, transportation, fashion and numerous other applications, so to speak of him as simply a textile designer seems almost insulting. Larsen put his understanding of textiles and their manufacture to use in elevating them within the larger world of the Built Environment.
Were they “Art”? Were they “Design”? Were they “Craft”?
Yes – absolutely.
The title isn’t important: what IS important is that Larsen used his knowledge and skills to improve the state of everything from fiber selection and usage to building performance to – of course – the aesthetics of the environments he dealt with.
Current Material Designers are keenly focused on environmental concerns, but are also addressing other performance characteristics, whether through the incorporation of technology, increased durability, or colorfastness. These attributes will then be spread far and wide into all aspects of . . . Art, Design, and Craft.
Design is constantly talking about the blurring of boundaries, the smashing of silos, and adding some new prefix to the term “disciplinary” to prove that it impacts all facets of life. As a professor in one of those ______disciplinary programs, I also fielded a lot of questions from students about their future: “what, exactly, can I do with a ‘Design Studies’ degree?”
My answer to that was something along the lines of “you can do whatever job you want to – just better.” I meant that they could take jobs in any corporation, any school, fill any role, and they would have a leg up on how to make things better.
The news (pretty much everywhere) has been dominated by stories of the vaccine rollout, and how (pretty much everywhere) it has been a slow one. There has been an effective vaccine available for about a month now, and the arrival of these “silver bullets” has been talked about for many more months.
Yes, there are (some, just a slight puff) political headwinds. Yes, there are realities of how the vaccine needs to be preserved. Yes, there are varied compatibilities between geographies and health networks. Israel has exploited many of its inherent cap/abilities to lead the world by vaccinating just about 40% of its population, and the UAE is doing the same to see just about a quarter of its people vaccinated (all numbers as of Jan 23). After that, the numbers get more and more sobering: the UK (#3, just under 10%), the US (#5, just over 6%), and when the list hits #7 (Germany) the percentage drops below 2% of their population being vaccinated.
That said, there was one story in the US that really pointed out the critical – downright life-impacting – need for design knowledge in all areas of an organization.
- Commonly used syringes cannot expel the entirety of the medicines in them.
- Commonly used syringes are single-use, and are disposed of along with the residual medicine in them.
- Pharmaceutical companies ship vials of medicine with sufficient “extra” volume to account for this known waste issue.
- Given the extreme pressure placed on pharmaceutical manufacturers to deliver as many doses of vaccine as quickly as possible, these manufacturers (had the audacity to) want to “count” the overfill amount as an additional dose.
- This “overfill” volume is equal to a 20% increase in the number of available doses.
- But – the only way to dispense these doses is with “specially-designed” syringes, which are – of course – not readily available.
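The dose arithmetic is worth sketching out. A minimal back-of-the-envelope calculation – the vial count and the five-doses-per-vial figure are assumptions for illustration; only the 20% overfill gain comes from the story above:

```python
# Rough overfill arithmetic: how many doses a batch of vials yields with
# ordinary syringes, versus low-dead-space syringes that can draw the overfill.
# (Illustrative numbers only - not official dosing data.)

def doses_available(vials, labeled_doses_per_vial, overfill_gain=0.20):
    """Return (standard doses, doses recoverable with a 20% overfill gain)."""
    standard = vials * labeled_doses_per_vial
    recovered = int(standard * (1 + overfill_gain))
    return standard, recovered

standard, recovered = doses_available(vials=1_000_000, labeled_doses_per_vial=5)
print(standard)   # -> 5000000
print(recovered)  # -> 6000000
```

A million vials, and a fifth of the potential doses riding on which syringe is in the box.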
Crazy as it sounds, let’s put COVID-19 aside for a moment . . .
Many are aware of the “Pharma Bro”, Martin Shkreli, who raised the price of a medicine in his portfolio of drugs by a factor of 56. (This is a price discussion, not a delivery one – I know that his was a pill, not an injectable. I now return to delivery-related matters.)
Many others (in the US especially) are much more painfully aware of the not-even-close-to-the-cost-of-water cost of insulin. More than eight million people in the US, and some half a billion people worldwide take some form of injected insulin. For those in the States, a vial of insulin costs around US$300.
And, depending upon how that insulin is being delivered, some US$60 worth of every vial is being thrown away.
The cost of life-saving medicines has been a topic of discussion for years – decades, even. It has always been seen as some mix of government intervention, insurer control, and industry greed. There has always been an element of “process design” in these discussions – who pays for what, when, and how – but to see this very obvious Design 101 flaw is a sobering thing.
Even before the COVID-19 vaccine became available, people were writing about syringes: this article estimated that some 4-8% more doses could be extracted from the same number of vials that were ordered, if the more efficient syringes were used. For a month now, those doses – around the globe – have been thrown out. For many more years, many other drugs have also met the same fate.
I try to impress upon all students the costs of waste – in materials, in manufacturing, in transportation, in energy consumed. I’ll have to add an additional category.
For years, I walked by a colleague’s door – a member of the Graphic Design Faculty – that had a poster proclaiming:
It always struck me that an equally self-aware (and self-deprecating) member of the Fine Arts Faculty could have the opposite poster on their door: “Art is Design for Losers”.
In the end, I have sort of hewed toward the following two sentiments, expressed by people far more worthy than I, and in far more thoughtful terms:
And, from Robert Capa:
“Art is some guy using a medium to change how you see the world. Whereas design is changing how we live in it.”
(Sorry, but Robert Capa, as a photographer, doesn’t have a slick graphic . . .)
The gist of what I am getting at, however, is that the two are not all that far apart in many respects – though they are often forcefully separated.
Trade Shows can often highlight both the differences and the similarities, and the most recent Salon event here in NYC was a good example of how the two coexist quite comfortably. While I normally prefer shows that feature experimental works rather than established (and expensive) ones, I was lucky to know a design team that was being exhibited in Adrian Sassoon’s booth; and, free ticket from them in hand, I ventured into Manhattan.
As is often the case – experimental or established – these shows really highlight the importance of knowing one’s materials.
The aforementioned design team – Vezzini & Chen – are a perfect example of this: Cristina Vezzini works with ceramics, Stan Chen works with glass, and the two meld pure expressions of their materials into remarkable lighting pieces. The two have been honing their craft since meeting at the RCA in London, and have produced memorable pieces all along the way.
What I love about their pieces is that both of the materials work with light in different ways, and they exploit those qualities in many ways that are so similar, and yet so different.
The title of the collection from which this piece comes is Inverted Gravity, one of the more engaging titles around – and far more understandable in this photo:
Obviously, the pieces are dramatic on first sight: the title only serves to drive home the point that the designer, Mathieu Lehanneur, is playing with.
Our brains expect to see materials like marble on the bottom of objects – and glass on the top. That we are able to slice thin stone veneers with great accuracy, allowing “stone” to live in places that it couldn’t before, slightly obscures the real point: that glass as a material – while brittle – is quite strong. Lehanneur’s glassblower subjected an initial bubble prototype to some 250 kg of weight without incident.
In this scenario, the designer is deliberately playing his materials against type, while still celebrating their inherent properties and qualities. In other pieces, however, Lehanneur cuts across these elements, creating pieces that defy their material origins.
I’ll be honest, I was ready to pull out the hatchet on this piece: so many artists/designers are enthralled with the ability to “quickly” and precisely mill hard materials, that they forget what they are doing (or, they know what they’re doing, and I find it wasteful . . .).
I thought that this piece was CNC milled from a solid piece of marble, which – while enthralling in its delicate presentation, and running counter to marble’s solidity – would be a tremendous waste of material. I was, therefore, excited to hear that the piece was cast from marble powder and resin – a much better use of material.
Moving to the back of the room, there was an equally delicate and enthralling – and circular – piece.
We could get into a larger discussion of depicting a Black Hole – the expression of infinite gravity – using paper, but I would prefer to focus on the use of paper – Washi paper, specifically – in lighting. This, one of the most humble of materials, becomes absolutely transcendent when applied to lighting; and the use of it in layers like this is simply amazing.
I had initially wondered if the material was paper – or ceramic. Paper is usually considered too mundane for such a headline piece, and ceramic or porcelain can have the same translucency and papery feel. While the light was, indeed, paper, there were wonderful pieces that exhibited the same look – in ceramic . . .
When we come back, we’ll get back to lighting – and back to a more traditional material for it: glass.
A couple of years ago, struggling to come to grips with the lunacy of the US presidential elections but not wanting to lose focus in the posts, I used a political video to introduce a materials topic. The video was mocking (then candidate) Donald Trump’s incessant talk about China, and I linked that to the topic of rare earth materials and China’s virtual lock on the global supply. The core commentary was on the geopolitical implications of this stranglehold, specifically citing an incident between China and Japan.
Maybe he should have read the post . . . Three years later, Trump’s battle with China has brought the United States to the brink of the same predicament: China is beginning to talk about the possible restriction of exports of rare earth materials to the US.
The list – the US Government’s roster of 35 critical minerals – is really longer than 35 materials, as “Rare Earth Elements Group” is counted as one material, though there are 17 elements within the group. Regardless, the US imports at least 50% of 23 of the 35 stated materials – and 100% of the Rare Earth Group. And it gets 100% of that Group from China.
If there is some sort of restriction on the flow of these materials, it will impact industries from electronics to automotive to aerospace. The demand for these materials, and industry’s reliance on them, has led to the funding of research to determine new sources, both foreign and domestic. In the US, researchers in Virginia are looking at the extraction of the minerals from Trump’s other political tool – coal.
The team at West Virginia University found that acid mine drainage – literally, acidified water that results from geological disturbances – naturally concentrates rare earths. Coal mines produce significant volumes of this discharge and are required to treat the waste; they do so in such a way as to produce solids that are rich in rare earth minerals, which serve as good feedstocks for their extraction.
If there is a silver lining to the decidedly black cloud that coal hangs over the environment, this could be it. Preliminary estimates show that the discharge from the Appalachian basin alone could provide the needed materials for the US industry. In addition, raising the value of what was previously seen as a difficult waste material should lead to significantly lower volumes of it being dumped into waterways.
My love of socks notwithstanding, the Fashion Thing – in general – is my least familiar territory. I love to see Design as a solution to a problem, and I can’t really see how claiming “_______ is the new black” does that. I absolutely believe in the power of personal expression and individuality, but I feel that Art does that better – so, maybe, Fashion is more like Art than Design for me?
“Tech” is another aspect of the larger Design Universe that I am uncomfortable with. However, schools around the world promote Arduino in the classrooms and technology companies in their incubators, so – whether I like it or not – there is no question that Tech is here to stay.
I do, however, “get it” as far as Tech is concerned: it has the greatest universe of possible applications, and the best margins in the business world. The question I always come back to, though, is “how many apps can the world really need?” (And, you’ve heard it before, don’t get me started on Uber . . .)
That said, I am a firm believer that all of us – from designers to materials scientists to coders – will be coming together in service to a trend that will redefine the way we live: wearable technology. It is my belief that Wearables represent the best of what Fashion and Tech can offer.
As with most of my research, I wanted to see where it all began: what was “The First Piece of Truly Wearable Technology As I Have Chosen To Define It”?
[I am not talking about eyeglasses or wristwatches, nor am I referring to Walkmen or GoPros. I am talking about garments that integrate – through more than duct tape or a couple of straps – some form of technology for the benefit of either the wearer or those around them.]
Thinking about my parameters, I assumed that it would have occurred in one of the various space programs – either US or Soviet – of the late 50s/early 60s.
As with most of my assumptions, this one was completely wrong.
The early “space suits” were glorified plastic bags that only served to maintain pressure (and a breathable environment) for the wearer.
Developments were focused more on increasing mobility – from pretty much none – than adding technology. There were the obvious communications additions but, as mentioned earlier, I am not interested in projects where they simply bolted a microphone to a helmet and put a radio in a backpack.
Venturing deep into the history of technology/fashion/design brought me to an unusual place: Madison Avenue, shopping mecca of New York, and a generally mind-numbingly boring corner of the world (see my opening comments on Fashion).
There, in the mid 1960s, a boutique called Paraphernalia invaded the shores of the otherwise staid Upper East Side like a pirate ship at a Yacht Club Regatta.
The store, a transplant of London’s Mod obsession, featured outfits that were not dry-cleaned but rather wiped down with a damp cloth. It was frequented by Edie Sedgwick, and it featured Betsey Johnson as a member of the creative team (she had been an assistant in the art department at Mademoiselle – the Teen Vogue of its day – after winning a contest to be a guest editor). The store also hired some of the most outrageous talent in the fashion industry: one of whom was a young woman (though they were ALL young women in the design department) named Diana Dew.
Mind you, I had never heard of the store – much less Diana Dew – when I wandered into this rabbit hole, so to hear that Johnson was there, that Sedgwick was their Number One Fan (and brought Andy Warhol along), that the Velvet flipping Underground played the opening party, it all pretty much blew my mind. Man.
That this intersection of Design, Music, and Youth gave birth to some remarkable things was not so hard to wrap my head around. What was hard to fathom was that a.) so much of it was unknown to me and to others interested in this area, and b.) many of these groundbreaking designers simply faded away.
The store was created specifically for young people: it was supposed to offer them the newest of the new at a price that they could afford. They hired young, mostly unknown designers and gave them free rein to create whatever they wanted to. They tapped into the psychedelic scene – with its music and focus on expanding minds through both pharmacology and technology.
Weird things were bound to appear – and where that happens, epiphanies happen. (Sadly, among other prescient aspects of the store, it heralded the arrival of “Fast Fashion” and garments that were envisioned as being worn only once before being disposed of.)
Given the anecdotal evidence – really all that is out there – it is shocking that Diana Dew is not better known in . . . some circle. She has very little written about her, despite stories of incredible accomplishment.
She worked with Bausch and Lomb to create specific tints for lenses, worked with Sylvania to create flexible electroluminescent (EL) sheets, and experimented with thermochromic (changes in temperature change the color of the material) pigments/materials; she even managed to devise a battery system to power her dresses (for 5 hours) that was small enough to wear on the belt of the dress – and was rechargeable (and I only read about a single mishap). Groundbreaking stuff.
(Again, for emphasis: we’re talking about 1965. EL wire and tape still astound people today . . .)
Like many of her creations, the dress was – as far as Fashion was concerned – pretty basic: a par-for-the-course minidress with spaghetti straps, and without the inclusion of the systems she devised it would hardly earn a second glance. That might have been purposeful, as a sort of decoy to increase the surprise when the lights came on.
The various EL panels were connected to some combination of programmed circuitry and a potentiometer – basically a speed control that allowed the wearer to change the timing of the strobing of the lights. Step on to the dance floor, flip the switch, and the panels would light in a pre-programmed sequence – all that was left was to adjust the speed of the flashes to the music, and you were instantly transformed.
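In modern terms, Dew’s speed control is a potentiometer reading mapped to a blink interval – the same exercise taught in Arduino classrooms today. A minimal sketch of that mapping (the 0–1023 range mirrors a common 10-bit analog reading, and the interval bounds are my assumptions, not documented specifications of the dress):

```python
def strobe_interval_ms(pot_value, pot_max=1023, min_ms=100, max_ms=1000):
    """Map a potentiometer reading (0..pot_max) to a strobe interval in ms.
    Turning the knob up shortens the interval - i.e. speeds up the flashing."""
    fraction = pot_value / pot_max
    return max_ms - fraction * (max_ms - min_ms)

# Knob at zero: slowest flash; knob fully turned: fastest.
print(strobe_interval_ms(0))     # -> 1000.0
print(strobe_interval_ms(1023))  # -> 100.0
```

That single line of mapping – knob position to flash timing – is, conceptually, the entire “user interface” of the dress.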
So, Diana Dew is hereby crowned the Mother of Wearables (in my book), and I look forward to finding out more about her; however, this has also been the mother of all diversions.
Regardless of where you were, 2016 was a pretty difficult year: terrorism seemed omnipresent, as did refugees; countries saw historic votes carried by those who sought to close borders; and an overall sense of “us versus them” seemed to pervade the world – with neither side able to comprehend the other’s position.
Rather than pile on, I would like to bring you a good – seriously good – story from last year, and the world of materials.
Every year, Google sponsors (along with Lego, Virgin Galactic, National Geographic, and Scientific American) a competition called the Google Science Fair. It’s open to 13- to 18-year-olds, and has a number of different prizes associated with it: “LEGO® Education Builder Award”, “Virgin Galactic Pioneer Award”, and other wonderfully brand-conscious prizes. While they don’t release the number of entries, one would have to assume that there are many people interested in winning between $15,000 and $50,000. No?
In any event, the Grand Prize winner in 2016 – US$50,000 – was a 16-year-old student, Kiara Nirghin, from South Africa: and her subject was Super Absorbent Polymers.
First and foremost – congratulations to Kiara!
Secondly – Super Absorbent Polymers? Um, why? Seems an odd topic for a science fair project; but, hey, I’m happy she took Materials Science to the top.
Super Absorbent Polymers (SAPs) are a type of plastic that is capable of bonding to water molecules in an incredibly efficient manner. (In addition to their name, most of the literature talks about these plastics “absorbing” the water; but the concept of something “absorbing” up to 300 times its weight – or 30 to 40 times its volume – doesn’t quite make sense to me.) They are used in diapers, adult incontinence underwear, sanitary pads, and other things that we attach to our bodies to soak up the strange blue liquid that emanates from them.
From this . . . to this . . . in two minutes!
Other applications include filtration, wound dressings, moisture protection for electrical applications, and agricultural uses. It is this last item that was the focus of the project.
SAPs are very effective, when added to soil, at keeping the water where the plants need it. You know when you (finally remember) to water your fern, and the water runs completely through the pot and overflows the saucer under it? Adding SAPs to the soil would allow that water to remain inside the pot, meaning you would waste less water, and your fern would have a significantly better chance of surviving your neglect. In areas that are prone to drought, SAPs can be incredibly valuable.
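The capacity figures above make the fern math easy to sketch – the 5-gram dose below is an arbitrary illustration, not an agronomic recommendation:

```python
def water_retained_g(sap_grams, absorption_ratio=300):
    """Water (in grams, roughly equal to mL) that a given mass of SAP can
    bind, at the commonly cited ~300x-by-weight capacity."""
    return sap_grams * absorption_ratio

# 5 g of SAP mixed into the soil could hold on to about 1.5 L of water
print(water_retained_g(5))  # -> 1500
```

A teaspoon of polymer standing in for a liter and a half of reservoir: that is the scale of the drought argument.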
That said, people have recognized the need for a product that performs this function WITHOUT having any potential impact on the soil or plants that come in contact with it – meaning that a more natural solution needed to be found for crop plants.
Remarkably, most of the SAPs used in agriculture are still cross-linked polyacrylic acids, partial hydrolysis products of starch–acrylonitrile copolymers and starch–acrylic acid graft copolymers. That’s a lot of acrylic sitting in the soil, and potentially putting compounds into the plants.
Some companies, like Ecovia Renewables LLC, are looking to produce these materials through bacterial fermentation, making a final product from natural feedstocks (like corn and other renewable sources). Their BioGels™ are “100% non-toxic, eco-friendly, and biodegradable.” But they’re not available yet, and they only promise that they will be economically competitive. Not knocking them, just saying that the product is not necessarily ready for prime time.
According to her summary, her research topic came from a recognition that the severe drought conditions in South Africa – the worst in decades – were straining food production. The leap to SAPs is remarkable in and of itself, but the inclusion of waste material as the base for her explorations is inspired.
Noting that the peel of an orange is more than 60% polysaccharides, her hypothesis was that it could serve as an effective building block for an SAP. After some fiddling with polymerization techniques, and the inclusion of another peel/skin (avocado), she created what is probably an edible SAP that tested remarkably well when compared with synthetic varieties currently on the market.
By using a waste material as her raw material, she immediately keeps her costs low – trash is a relatively inexpensive material. Secondly, she’s potentially removing a volume of material from the Waste Stream: while in the US, the waste from juice extraction is completely reused, I don’t know what the situation is in South Africa. It could be argued, however, that putting this waste to use in the creation of SAPs is more valuable there than putting it into feedstocks.
Remember when we thought that the Ecosphere was such a cool thing? A completely sealed system that produced its own food and oxygen and used its waste as fuel and food?
It was supposed to be a metaphor, not a corporate gift. We live in a thing called the Biosphere – same same, but bigger.
Managing resources is increasingly critical to the success of business – not to mention the health of the planet – and the more that we can look to turning our “trash” into useful products, the better.
Congratulations to Kiara for inspired thinking, and to the judges for recognizing the importance of her project.
As Giving Tuesday has just passed (which, as a recent expat, I had never heard of), I thought that I would reflect on what I see as a troubling phenomenon: taking, every day, as if there’s no tomorrow.
Having lectured on the topic of Sustainability for a number of years, the standard assumption from most people is that the concept is solely based in some form of environmentalism. As a “materials” person, I can certainly appreciate that this is a large part of the concept; but I do subscribe to the idea that Sustainability is anchored in three specific areas. The environment is one, but Sustainability also includes human capital and economic capital.
Possessing a different middle consonant in my master’s degree than most of my colleagues – a “B” instead of an “F” – I have always been interested in that last leg of the triad, which is the least spoken of among the three. Yes, Triple Bottom Line and the Three Ps do reference it, and one of those Ps refers to the most unsustainable element in the economic review: Profits.
Without belaboring the point (too much), “Normal Profit” is defined as a company’s state when total costs (including opportunity cost) are exactly equal to total revenue. Or, put another way, “Normal profit is defined as the minimum reward that is just sufficient to keep the entrepreneur supplying their enterprise.” If a company makes more than this base level, it is referred to as “Super Normal” and “Abnormal” and – interestingly – “Accounting” profit.
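The distinction is easier to see with numbers. A minimal sketch (all figures invented for illustration): accounting profit ignores opportunity cost, economic profit subtracts it, and “normal profit” is where economic profit hits exactly zero.

```python
def profits(revenue, explicit_costs, opportunity_cost):
    """Return (accounting profit, economic profit).
    Economic profit of exactly zero is 'normal profit'; anything above it
    is the 'super normal' / 'abnormal' territory."""
    accounting = revenue - explicit_costs
    economic = accounting - opportunity_cost
    return accounting, economic

# A founder earns 100k in revenue, pays 70k in costs, and gave up a 30k salary:
acct, econ = profits(100_000, 70_000, 30_000)
print(acct)  # -> 30000 (looks profitable on the books)
print(econ)  # -> 0 (exactly 'normal profit': just enough to keep going)
```

The same books, two different verdicts – which is why “accounting profit” landing in the abnormal column is such a telling bit of terminology.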
We hear plenty about profits that seem “abnormal” these days, and those profits hardly seem undeserving of the moniker:
US banks took home profits – profits, not revenues – of more than $43 billion in one quarter. In the case of Apple, their share of the smartphone market profit was 104% (a “share” can exceed 100% because money-losing competitors drag the industry total below Apple’s own take), which translated to $8.5 billion. For one product line, from one company, in three months.
Anyone who has even the slightest interest in “raising capital” no longer thinks of a unicorn as a mythical, horned horse: no, a “unicorn” is a startup (which is a loose definition in and of itself) with a market capitalization of more than $1 billion. Many people are also familiar with the investment term “tenbagger”, which refers to an investment that appreciates to ten times its initial price – meaning a 900% increase.
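The multiple-versus-percentage arithmetic trips people up often enough to spell out: a price multiple of N is a gain of (N - 1) x 100 percent, because the original stake isn’t profit.

```python
def gain_percent(multiple):
    """Percentage gain implied by a price multiple: a 'tenbagger' (10x)
    is a 900% gain, not 1000%, because the original stake isn't profit."""
    return (multiple - 1) * 100

print(gain_percent(10))  # -> 900 (the tenbagger)
print(gain_percent(2))   # -> 100 (a mere double)
```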
I would argue that these are unsustainable concepts, and that the world of finance has radically altered global economic sustainability through its pursuit of truly super-normal profits. This impact has also had profound effects on the way Academia both educates its students and conducts its research.
The focus of many educational programs – from design to business – is on “disruption” and “innovation”: two words that (if it were up to me) should be relegated to that particular Hell that now contains “granular”, “synergy”, and “robust”.
Why is the focus on these elements? Because that is what investors not only want, but need, to see in a business. And the investors are driving the school bus.
I’m just worried about where it’s headed.
One of the most high-profile examples of where I do see it headed is/was Theranos. The moment that Elizabeth Holmes dropped out of Stanford and used the money that her parents had saved for her tuition to start the company, the diversion of money from Academia to Industry had begun. Ultimately, Holmes and company would raise more than $700 million from investors, and would have a market capitalization of some $9 billion earlier this year. Then, the gravy train (sorry, mixing my transportation metaphors) went off the rails.
Turned out that the magical Edison (did they have to license that name – and is he rolling in his grave?) technology, shrouded in “proprietary” secrecy, wasn’t so magical after all. Thirteen years of research, and hundreds of millions of dollars, were spent in support of what appears to many as a complete fraud. While Theranos is the most spectacular implosion, there are many other research-based startups (technology mainly, but also other medical ones) that are operating on the fringes of what is possible.
[For more on this, I recommend Paul Reynolds’s blog . . .]
Which brings me back to my world – the Material World.
In that world, one of the most dynamic – if not THE most dynamic – areas of research and exploration is Synthetic Biology: effectively, designing organisms that can either produce novel compounds or produce known compounds through significantly more efficient and sustainable methods.
Research into this area is the very definition of innovative and disruptive. Metabolix and Danimer Scientific have been working to expand the market for the biopolymers known as polyhydroxyalkanoates (PHA). PHA is a plastic produced by bacterial fermentation using basic, natural feedstocks like sugars (corn syrup) or oils (canola oil). Modern Meadow is working with bio-engineered collagen to produce synthetic leather that is virtually identical to the real thing (and, it would seem, is also the stepping stone to other materials).
All of this work will be made more efficient, economical, and “tunable” with a thing called CRISPR.
One of the most dramatic breakthroughs – period, full stop – in recent years is the identification, and exploitation, of Clustered Regularly Interspaced Short Palindromic Repeats, aka CRISPR. These repeating sequences of DNA are believed to be a sort of “genetic memory” of past viral invasions; but their importance is in their use as a guide for the precise “editing” of DNA. With a tool like this, highly targeted organisms can be efficiently created for the production of an ever-widening range of materials. However, to focus on materials is to miss the real point of the technology: using this tool, the cure for a wide range of genetic disorders appears well within reach.
The full genetic modification technology is known as CRISPR/Cas9 and, according to James Haber, a molecular biologist at Brandeis University, “[it] effectively democratized the technology so that everyone is using it. It’s a huge revolution.” Previous technologies were both expensive and complex, but the new technology can be accessed for as little as US$30. This is not to say that anyone with enough money for a Star Wars action figure can start churning out glowing rabbits: the technology is only “democratized” for knowledgeable researchers.
Or is it?
A massive legal battle is currently underway to determine who gets to lay claim to the ownership of the technology; and the combatants are – remarkably – not companies, but schools. However, where there is smoke, there is fire – and the primary reason (not the only reason) for the battle is how much money is at stake.
On team Left Coast is the University of California, Berkeley (and the University of Vienna, but they always come after the comma, so I’m putting them as “teammate”), and on team Right Coast is the Broad Institute, a partnership including Harvard University and the Massachusetts Institute of Technology: no slouches are to be found at the table. The parties have been engaged in some form of patent litigation since the beginning of 2014, and early last year UC Berkeley filed a “patent interference” motion that effectively sets the stage for a “winner takes all” outcome.
Exciting as that is, the real story is what lies behind the tables. Money, and lots of it.
In the three years since the technology was discovered, more than $600 million has been raised (other estimates put the number at more than $1 billion) by a number of companies hoping to exploit the opportunity.
One of those companies is Editas – get it? “edit”? – Medicine, whose founders included (note the past tense) a member from each of the teams that submitted patent applications. Editas has already licensed the technology from the Broad Institute (Team Right) and, based on this arrangement, raised some $200 million in venture funding and then, just this past February, went public – raising an additional $95 million. Not long after that, they saw their market capitalization eclipse the mythical $1 billion mark.
And they don’t even know if they own the technology.
It is numbers like those that will keep this case in court – money, and the absolutely bizarre circumstances of the patent filings. (In a nutshell, both teams filed for a patent, but in between the two filings the law changed from “first to invent” to “first to file”. Berkeley was the first to submit their claim into the system in 2012, but the law changed in March of 2013 to recognize “first to file” – and when Broad filed later that year, they requested an expedited ruling and effectively leapfrogged the Berkeley submission that was waiting in line per the earlier rules. I was going to suggest a moneyed conspiracy theory, but the Congressional Act that created the changes was introduced in early 2011, and spinning that level of web is beyond me . . .)
At the beginning of the research process, the two programs were collegial: in January of 2013 the lead researcher for the Broad wrote to the lead researcher at UC Berkeley, “I met you briefly during my graduate school interview at Berkeley back in 2004 and have been inspired by your work since then.” He continued, “I am sure that we have a lot of synergy and perhaps there are things that would be good [to] collaborate on in the future!” As mentioned previously, the two would also sit together on the board of Editas. Then, rather than share – as universities often do – the lawsuits began.
At the core of the current debate is whether the work that UC Berkeley did was informative enough to cover the work that the Broad did, or, in a great bit of plain-speak, “I’ve given you the recipe for a Western omelet; how good a chef do you have to be to use that to make a soufflé?” Berkeley claims that anyone in the field could make the leap, Broad says only a genius – their genius – could.
What is at stake is more than the future of a business sector that is predicted to grow to some $3.5 billion in revenue in just two years. As a point of comparison, the in vitro fertilization industry took almost 35 years to become a $9+ billion industry. What is at stake is the future of truly “democratic” research that has a tremendous positive impact on the public good.
Research is a process, and a slow one usually. Additionally, to paraphrase another researcher – Isaac Newton – people see farther by standing on the shoulders of those who went before them. Giants or others. Research done in schools and public institutions is validated by peer-review, not investor contributions, and is made very public. (Whether anyone outside of their circles can understand what is being said is another story completely.) Unfortunately, in a world that is rapidly running out of opportunities for Tenbaggers, this kind of research is being commercialized at an alarming rate, and for the financial benefit of very few.
Schools are in the business of creating productive people, as well as contributing to the global body of knowledge. As businesses, they also need to earn “normal profits” – in the truest economic sense of the term. I know that there are a wide variety of structures for educational institutions, but for the vast majority – the nonprofits – it would seem to me that they need to earn only the minimum reward sufficient to keep the institution supplying its clients.
Licensing technology is a tremendous source of revenue for many schools, and many need that revenue to earn those normal profits; but many don’t. The total value of the top ten university endowments is $170 billion. Ten schools, and – yes – Harvard and MIT, the two partners in the Broad Institute, are on that list at number one and number five, with endowments of $37 billion and $13 billion respectively.
Research and development cost money, and no one could argue that they aren’t the core of all great universities. These institutions are going concerns, and they need to cover costs – including those that give them a competitive advantage in the future. By the same token, for-profit businesses should also be able to make more than “normal” profits – but how much more?
Beyond these few individuals are a similarly small group of investment firms that control equally vast sums of money. Blackstone, Vanguard and three other funds control more than $14 trillion – and entire sectors of the global economy. They, too, are radically impacting the world of research and development on a corporate level.
These investors, both individual and corporate, cannot continue to seek ever-growing levels of profit without seriously harmful impacts on the global economy – and global welfare. Furthermore, if investors are allowed to infiltrate Academic Research, the answers that the world needs – to heal the sick, to feed the hungry, to heal itself from the ravages of humanity and survive the next hundred years – will literally be sold to the highest bidder.
So, back to where we started: Sustainability.
We have limits on the amount of fish that can be pulled from the sea, because without some fish now there won’t be any in the future. When a company cuts a tree down (with or without someone around to hear it), they plant two in its place to ensure a future supply.
There are no such laws in place on Finance – and it is wholly unsustainable. We cannot let this destructive quest for More invade the halls of Academia.
Putting aside the politics of this argument, I’d like to address the matter (pun intended) at hand: what good is coal?
Coal has been used for thousands of years as a fuel for both light and heat, and its use as an industrial fuel dates back to 1000 BCE. (No, I am not proud to have used the Daily Mail as a source.) Here in the United States, it has been used since before there was a United States; but I do love how this article by the US Department of Energy states that coal was “rediscovered” in 1673. What, the Hopi simply “used” coal, but the early Europeans – who knew what it was really all about – “discovered” it? Odd.
Coal is pretty much carbon with a couple of other elements like nitrogen and sulfur thrown in for variety. It is the result of layers of peat – that matted, layered grass that forms in marshy areas (and makes your Scotch taste like burnt plastic) – pressed and heated over millennia. The more the heat and pressure, the harder the coal.
Jet, a sort of kind of coal, is cut and polished and made into jewelry.
Graphite, on the other end of the spectrum of sort of kind of coal, is used for pencils and writing instruments.
In between (sort of kind of) are the coal materials that burn. Really well. Coal has about twice the BTUs – per unit of weight – of wood. While I doubt that the Hopis were calculating energy efficiency, they clearly were able to see that coal burned hotter, longer, and more consistently than wood. I would also guess that they noticed that it didn’t get eaten by bugs or soaked with water, nor did it take up as much room as wood.
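To put that “twice the BTUs” claim in rough numbers, here is a minimal sketch; the heating values below are ballpark figures I am assuming for illustration, not authoritative data:

```python
# Approximate heating values in BTU per pound.
# These are illustrative ballpark numbers, not measured data.
HEATING_VALUE_BTU_PER_LB = {
    "air-dried wood": 6_500,
    "bituminous coal": 13_000,
}

# Pound for pound, how much more energy does coal deliver than wood?
ratio = (HEATING_VALUE_BTU_PER_LB["bituminous coal"]
         / HEATING_VALUE_BTU_PER_LB["air-dried wood"])
print(f"Coal delivers roughly {ratio:.1f}x the energy of wood, per pound")
```

The exact numbers vary with the grade of coal and the moisture content of the wood, but the two-to-one ratio holds up as a rule of thumb.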
Over time, these factors became more codified and coal was used to power the Industrial Revolution (among other things). Forges, locomotives, steamships, and factories of all kinds burned coal. In the 1880s, we added electricity to the equation.
The Pearl Street Station was opened in 1882 by Thomas Edison (the light bulb wasn’t so useful without electricity), and the plant would pave the way for the future of both electricity and Cogeneration. The PSS used coal to create steam – both to power generators for electricity and to provide heat for neighboring buildings. Legions of film directors are forever in his debt . . .
From this initial plant, the worldwide use of coal for the production of electricity has grown – massively. Coal is still used to produce more than 40% of the world’s electricity, consuming around three billion tons of the stuff every year. It’s cheap, and plentiful – so what’s the problem?
The biggest issue is that it’s dirty – and not chipper chimney sweep dirty.
From start to finish, coal makes a mess:
There is literally a mining technique called “Mountaintop Removal Mining”. While that stands on its own as a rather negative superlative, all coal mining is dangerous, labor-intensive, and destructive.
Transport of the material is also not without its problems: it is heavy and requires significant fuel to ship; it is usually shipped in open containers (coal dust is extremely flammable); and it is prone to spreading solid waste in the process. It can be, however, rather photogenic, in a depressing way.
While coal makes up around 25% of the total fuel consumption in the world (along with oil, natural gas, hydroelectric, etc.), it is responsible for some 40% of the CO2 emissions from that appetite. Coal-fired plants also generate significant waste, and hazardous waste at that. As these plants are situated near significant sources of water, coal has a very bad history of polluting them.
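The disproportion in those two shares is easy to quantify; this back-of-the-envelope sketch uses the round figures quoted above:

```python
# Coal's share of global fuel consumption vs. its share of CO2 emissions,
# using the rough percentages cited in the text.
coal_energy_share = 0.25  # ~25% of total fuel consumption
coal_co2_share = 0.40     # ~40% of energy-related CO2 emissions

# A ratio above 1 means coal emits more than its "fair share" of CO2
# relative to the energy it provides.
disproportion = coal_co2_share / coal_energy_share
print(f"Coal is roughly {disproportion:.1f}x as carbon-heavy as the average fuel")
```

In other words, for every unit of energy it contributes, coal produces about 60% more CO2 than the average of the overall fuel mix.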
So, overall, we should be moving away from it as a fuel, and toward cleaner renewables; which brings us back to the miners.
No presidential candidate really wants to tell people that a vote for them will cost them their job. Clinton’s comment was really meant for the environmentalists, but – for the miners – it stings all the same. A lot. Especially when one considers that the average coal miner makes almost DOUBLE what the average of all other workers in the US makes. To dig rocks out of the ground that sell for US$40/ton.
“So,” I hear you ask (again), “what?”
Well, what if there were better uses for this material?
Coal is mostly carbon, and we keep talking about how carbon nano-this and carbon fiber-that are such great things – right? Coal and its byproducts can be used in the creation of these high-strength, heat-resistant materials.
A company called Touchstone is using coal to produce CFOAM®, a highly-insulating material with uses in Aerospace and advanced building systems.
Researchers have also been looking into the creation of carbon fibers from coal tar for more than a decade, and fibers made from this precursor (as opposed to the more common Polyacrylonitrile or PAN) have been found to offer interesting opportunities to designers. They are easily customized, are stiffer, and are more conductive of both heat and electricity.
Perhaps based upon these findings, researchers at MIT have suggested that coal could be instrumental in the creation of a new generation of electronic devices – everything from batteries to solar cells. One of the drivers of the investigations was the fact that coal is so much cheaper than the current material used for these devices, refined silicon.
Could there be a future where large-scale solar arrays and high-efficiency batteries were significantly cheaper, making this renewable a more cost-effective solution – and building that future on the back of new-found coal research? It might be far-fetched, but it points out a key factor in materials utilization.
Coal itself is not a “bad” material. What we do with it is. While there is much to debate in the economy vs. environment scenario, industries that are founded on these pariah materials would do well to look at the potential positive benefits of the material, and look at ways to find new, benign uses for it. The current tactic of lobbying for protections (at best) or threatening would-be reformers (at worst), while still serving the gun lobby well, is pure ostrich thinking.
I have nothing against the birds, but if we continue to plead for an environmentally ruinous status quo to preserve the prosperity of a small minority of the world’s population, we will find ourselves at the mouth of a large and dirty river with no paddle.
As now-brunette Ryan Lochte anchors his legacy to That Bathroom Incident, it’s a good time to look at that lovable rube, the Obscure Olympics Scandal. From (Bad)minton to really – REALLY – poor cultural judgement, this global stage never misses an opportunity to entertain.
While the aforementioned PR nightmare for the US Swim Team might be hogging the headlines, there has been another tempest in a teapot this summer . . .
Table tennis balls, to be specific.
In 2014, the International Table Tennis Federation made the decision to change from celluloid balls to what were termed Poly-Balls. The rationale for this change has been debated by table-tennis aficionados (both of them), and has included such explanations as slowing the game down; “leveling the playing field”, which was considered code for “bringing the Chinese down a peg”; and safety concerns.
That last one took me a bit by surprise: I had never really considered a ping-pong ball much of a hazard at all, and couldn’t imagine what the danger was.
Then I saw this:
So, “old” table tennis balls are – still – made of celluloid, a highly volatile material that is difficult to ship via air due to its flammability. I suppose that I was most interested to see that anything was still made from this material.
Celluloid is among the earliest of plastics, and ushered in an era of – well – plastic products. What has always struck me about the early uses of plastic is the fact that they so clearly predicted their use today. For example:
These toothbrushes date back to the late 1800s, and despite the one on the right clearly being the more “consumer focused” in 1894 (the design was called “La Danse”), the one on the left wouldn’t look out of place on a shelf today. In fact, today you can get them for as low as US$0.70/ea (for 10,000), with your logo on them . . .
This is exactly what plastic is good at today – making large quantities of low-cost products.
Celluloid was also considered a protector of wildlife, as it “replaced” ivory, which – remarkably – gets us back to balls. And table-based sports. Around the same time that people were using the ornate toothbrushes featured above, those same people – the 1% of the 19th Century – were also playing Billiards. No mansion worth that title was without a billiard table.
Billiard balls, at that time, were made of ivory, and the decimation of elephant herds for the genteel sport was not insignificant. In 1867, The New York Times warned of an impending extinction of the animals based upon the growing popularity of the sport – and hoped that a useful substitute would be found. A New York billiard company, Phelan and Collender, supposedly offered a US$10,000 (almost $200,000 in today’s currency) reward to whoever could create a non-ivory ball.
The result, as legend tells it, was the “celluloid” billiard ball developed by John Wesley Hyatt. This new material was a further refinement of Parkesine, which was nitrocellulose – and considered the first synthesized polymer. Its creator, Alexander Parkes, envisioned it as synthetic ivory, and thought it would take the world by storm. It never did, though Hyatt’s improvements would bring the material into use beyond billiards, including piano keys and false teeth.
The new balls did spare the elephants, but were said to emit a crack like gunfire when they collided (apparently the stories of “exploding” balls are just that – stories). Moreover, they apparently didn’t have a “true” bounce or the durability of the earlier balls, which was a far more serious flaw.
Which brings us back to table tennis.
The ball materials used in the sport have ranged from cork to rubber, and the size of the ball has recently grown from 38mm in diameter to 40mm. This last change was made to render the ball more visible to the television audience. Regardless of why, there have been changes and the athletes have to adapt.
What is most interesting, however, is the notion that – in this day and age – we are not able to rise to the challenge of a material replacement scheme. The new material is actually rumored to be cellulose diacetate, basically a more stable form of the original material. So, why is it difficult to replicate the “feel” of the original balls?
In today’s competitive world, athletes spend hours – according to one theory, some 10,000 of them – repeating the same strokes, the same movements. If their equipment is inconsistent, then all of that muscle memory and cerebral calibration are completely useless – if not counterproductive.
Plastic resins are among the most malleable materials we have available to us, and among the most researched. All that really comes to mind to explain the inability to find a suitable resin is that this market – possibly fewer than 15,000 people – does not represent a viable return on the necessary R&D investment. Maybe if they can get the fraternity basement market interested . . .
For now, the players will have to keep leaving the signs of their displeasure for all to see.
When was the last time anybody saw us beating, let’s say, China in a trade deal? They kill us. I beat China all the time. All the time — D. Trump
I’d like to use this opportunity to talk about China . . . and recycling.
The preoccupation with China, and specifically trade with China, in the election has reminded me about one of my favorite Materials Stories: The Minjinyu 5179 Incident. If you’ve heard this one, skip down to here.
I had just moved to Doha in the Fall of 2010, and was charged with fashioning the syllabus for DESI510: Materials and Methods, a new course for the MFA program. Trying to set the stage for “the importance of materials knowledge in not only a design career, but any career”, I was mining the news for current events that would highlight – ideally in some scandalous context – this very notion.
Little did I know what the world had in store for me.
In September of that year a Chinese trawler, the aforementioned Minjinyu 5179, was sailing in the waters off the Senkaku/Diaoyutai Islands, one of many (very many) disputed groups of islands in the Pacific. The trawler, deemed to be fishing in Japanese waters, was approached by Japanese Coast Guard vessels and ordered to heave to (maritime speak for “stop”): it chose, instead, to ram the Japanese vessel and make a futile run for it.
Scandal, ahoy! (And, no, it’s not about the fish.)
Senkaku is the uninhabited islands’ Japanese name, and Diaoyutai is their Chinese name, and they weren’t much more than “turn when you see them” navigational signals for a long time. While the Japanese claim to have “discovered” them in 1884, the Chinese show written records of them dating back to 1403 – though they don’t necessarily say that they wanted them. Then.
In 1895, who found them first kind of went out the window as Japan formally annexed the islands as part of their spoils from the First Sino-Japanese War. A few wars later, in 1945, when Japan surrendered to the United States at the end of World War II, they fell under US occupation but were later returned to Japanese control, with the passage of the Okinawa Reversion Treaty by the US Senate, in 1972. Coincidentally, in that same year, China also claimed ownership of the islands. Curious.
While the islands were mere road signs in 1945, in 1969 they had a Beverly Hillbillies moment when a UN commission identified possible oil and gas deposits in their vicinity, and they were suddenly much more valuable than their previous use as a bonito-processing station. While this was perhaps the origin of China’s renewed interest in the islands, it was far from the end; and the current ownership map looks like this:
So, you’ve got two countries locked in a dispute about territorial ownership, and a bunch of boats stepping on each other’s proverbial toes – a diplomatic brouhaha was bound to arise. And it did.
Rare earth elements, or rare earth metals, are things like cerium (Ce), dysprosium (Dy), europium (Eu), gadolinium (Gd), holmium (Ho), neodymium (Nd), praseodymium (Pr), promethium (Pm), samarium (Sm), terbium (Tb), thulium (Tm), and ytterbium (Yb). Apologies if I have edited out your favorite.
While most of us have never heard of dysprosium – much less how to pronounce it – it and its rare siblings are the bedrock of the technology sector. So much so that the Japanese refer to rare earth metals as the “Seeds of Technology”. (Well, that’s what these people say. I wish they were Japanese . . . )
OK, these things are important, and China doesn’t want Japan to have them – so?
At that time, China produced – depending on who you asked – anywhere from 90-99% of all of the known rare earth metals, so if they wouldn’t sell to Japan, it would effectively stop Japan’s ability to produce technological products – which, one could argue, accounted for almost a quarter of that country’s GDP. Game – potentially – over.
Suddenly, stuff like this “dirt” was bringing a country to its knees.
This was DESI510 gold – GOLD!
The Chinese were using materials – esoteric ones at that – to threaten the Japanese with economic ruin. In the end, at least according to my syllabus, the Japanese relented, apologized to the Chinese, and restored the sustaining flow of these raw materials to their industry.
In reality, it appeared that the withholding of materials was a ruse, the Japanese released the captain of the fishing vessel only after the Chinese held their feet to the fire diplomatically, and both they and the Chinese escalated their rhetoric about sovereign domains. World War III was predicted, and rare earth metals were forgotten.
Until a couple weeks ago.
A recent Wall Street Journal article, entitled “China’s Rare Earth Bust”, led with the telling of Honda’s recent development of a hybrid engine that didn’t use any rare earth metals – a pretty prodigious feat. It took this achievement as proof of one of Julian Simon’s “theories” – that “human ingenuity” has the capacity to overcome shortages in natural resources.
I had never heard of Simon, but the concept that high prices or shortages of materials lead to greater innovation is not a new one. We mostly talk about it in terms of energy prices: the higher the price of traditional sources like oil and coal, the more attractive renewable forms of energy – like wind or solar – become. The point there is that “clean” (but more expensive) technologies benefit – and that’s good.
The Journal also made the link to energy but, instead, chose to highlight fracking – fracking – rather than clean energy. This is what fracking looks like when oil prices go back down.
It went on to say that higher rare earth prices led to – surprise – more efforts to mine them, as the returns were better. This is what more mining looks like – in this particular case, in China.
There was one line about recycling: “Metals firms began recycling more lanthanum, dysprosium and other coveted elements from industrial waste.” One line. This would have pleased Mr. Simon, who also famously wrote the following:
“Because we can expect future generations to be richer than we are, no matter what we do about resources, asking us to refrain from using resources now so that future generations can have them later is like asking the poor to make gifts to the rich.” (buy the book here!)
Without belaboring the idiocy of the Journal’s angle in this article, it is the perfect place to return to my original point – recycling – and a question that has been in the air for a long, long time:
Why in the world hasn’t Apple, or any other tech producer, changed their model from a sales-based to a leasing-based one?
We have all (mostly) agreed to bow down to Moore’s Law and consume products like phones and computers at a voracious pace. Just this week, Apple announced that it has sold – drum roll, please – One Billion iPhones.
In my class, I used the following statistics: every one million cellphones recycled would yield 750 pounds of silver, 50 pounds of palladium, 70 pounds of gold, and 35,000 pounds of copper. Using only iPhones, one can simply multiply by one thousand to get the impact – and the value of materials at stake here. In today’s market, 70,000 pounds of gold is worth US$1,373,500,100. Give or take. And, remember, this is just iPhones – according to this site, there is approximately $9.00 worth of gold in every computer.
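The arithmetic behind that figure is easy to reproduce; note that the gold price below is my assumption (a circa-2016 spot price, not from the statistics above), so treat the output as an estimate:

```python
# Recoverable metal per one million recycled cellphones, in pounds,
# per the statistics quoted in the text.
YIELD_PER_MILLION_PHONES_LB = {
    "silver": 750,
    "palladium": 50,
    "gold": 70,
    "copper": 35_000,
}

PHONES_SOLD = 1_000_000_000     # one billion iPhones
TROY_OZ_PER_LB = 14.5833        # avoirdupois pounds -> troy ounces
GOLD_USD_PER_TROY_OZ = 1_345    # assumed spot price, for illustration only

# Scale the per-million yield up to one billion phones.
gold_lb = YIELD_PER_MILLION_PHONES_LB["gold"] * (PHONES_SOLD // 1_000_000)
gold_value = gold_lb * TROY_OZ_PER_LB * GOLD_USD_PER_TROY_OZ
print(f"{gold_lb:,} lb of gold, worth roughly US${gold_value:,.0f}")
```

The one subtlety worth flagging: precious metals are priced in troy ounces, not avoirdupois ounces, which is why the conversion factor above is 14.58 rather than 16.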
As my wife is painfully aware, I am a bit of a freak and save all of my old technology. I don’t know, I guess I harbor the fantasy that sometime in the future, the only thing that is going to save us is a TRS-80. Or a 512k Mac. Or a Motorola Star-Tac. Or a Visor Edge (which looks surprisingly like an iPhone). I could go on, but you get the picture.
It was not lost on me, however, that the real value of all of these objects drifts to virtually nothing two years after they roll off the line. We really do feel like they are simply worthless.
In my list of trash antiques above, however, is an omission: the first computer I actually “worked” on, the Original 128k Mac. Why don’t I have that one? The reason is simple: when I graduated, there was an offer to trade in the 128k model for a discount on the new, 512k model; and when the price of a computer – in 1988 – was almost $3,000 (more than $6,000 in today’s dollars), you would take any discount you were offered.
In addition, early PCs – and even early laptops – were modular and upgradeable. My beloved G3 laptop (not purchased because it was codenamed “Lombard”, but it was an added bonus) had an easily removed keyboard so that the user could add RAM or change the processor. Now, we get beautifully honed – but very closed – aluminum cases that allow for no changes to hardware, and fewer and fewer ports to boot.
No ability to upgrade means no future value means the trash heap. Wouldn’t it make more sense to get all of that equipment – and all of the included materials – back? We trade in our cars, why not our technology?
The benefits of leasing equipment would be many. As a materials person, the obvious benefit is control of materials: if you own the phone, you own the materials in it. Play your cards right, and you could reduce your purchases of virgin materials to almost zero. This would also encourage the design of more modular products, ones that would be easy to disassemble and “mine” for parts and materials.
Google, building on Phonebloks, is already venturing in this direction with the Ara. In addition to creating a modular, upgradeable, waste-reducing phone, they are also creating a community. This is a key part of any/all retail strategies.
When I buy a phone – straight up – I own it. When I lease a phone from a wireless carrier, I have a relationship with that carrier, NOT the manufacturer of the phone. Additionally, I will most likely want a divorce in a very short period of time (yes, I mean you, Verizon).
If I leased a phone from a manufacturer, I would have an ongoing relationship with them, and would be a known customer (remember, it’s cheaper to keep customers than get new ones) and a captive source of information for them. I would no longer hesitate about purchasing an xPhone 24 for fear that the xPhone 25 will be released next month, damning my piddly 24 to immediate obsolescence.
The manufacturer, on the other hand, doesn’t worry about abdicators – unless they turn a deaf ear. They would have a constant line of quantifiable metrics from their community, and could tell which features to keep and which to abandon based on users keeping or returning their equipment.
I am not insinuating that Apple doesn’t make every effort to get this sort of information already, but the change in model would make the whole process much more seamlessly integrated. Information, materials, and customers would flow in a much more closed loop.
I’m not a big fan of Google – ’cause I’m reading this – and I find it interesting that in this case, a hardware scenario rather than a server scenario, they are very much on the right track.
Control your materials: it can be a benefit to you and your customers.