The complexity conundrum

It’s a rare occurrence, but once in a while, night-time TV truly inspires. Last night was such a night; I was watching ‘Particle Fever’, a documentary on the search for the Higgs boson at CERN. Perhaps it doesn’t make fascinating TV for those who are interested in the drinking and flirting habits of teenagers on Greek islands, but I find it infinitely more interesting. After all, this is about pushing the boundaries of fundamental science, the search for the nature of the universe (or multiverse) and, most importantly from my perspective, the way people pull together to realize phenomenally complex technological systems. 

And therein lies a paradox, it seems. It is not a new one per se, or perhaps it was hiding in plain sight, or (most likely) I just overlooked it or failed to put my finger on it. But it definitely seems to be there. And this is it: apparently, the tinier the structures we want to understand or manipulate, the bigger the equipment we need to do it. This may sound trivial, but if you think about it, it isn’t.

Take microscopes, for example. If you want to see something on a micrometer scale, a table-top optical microscope will do just fine. You know, the kind that will fit in a 50x50x50cm box. Then, if you progress to the nanometer scale, you need to resort to electron microscopes, which these days start at table-top dimensions too (although they won’t fit the box I just mentioned), but before you know it, you’re looking at one-meter electron columns with detectors sticking out at all sorts of places. And if you get down to angstrom resolution, where it becomes possible to clearly see individual atoms or even the bonds between them, the only options I’m aware of are huge TEMs that wouldn’t even fit in my living room (which, granted, is not really that big).

So in order to see increasingly smaller stuff, we need increasingly bigger toys. The microscope example fits the bill, but so does that of manufacturing equipment for semiconductors, particularly photolithography tools – they started at table-top sizes too, back when structures of hundreds of micrometers were plenty good enough, and they have since grown into bulky machines the size of a shipping container in order to print structures of 15 nm and smaller. You want to figure out what happens on a scale of one-thousandth the size of the nucleus of an atom? You apparently need a particle accelerator with a 27 km circumference, paired with detectors 15 m in diameter. It’s a funny game. The smaller the playing field, the bigger the ball we use to play it.

And it’s not just size that increases as the structures we see or make get smaller. It’s complexity, too. Complexity is a bit of an abstract concept, but I always think of it as the combination of many elements that interact in many (and often unpredictable) ways. The large size of state-of-the-art electron microscopes, lithography equipment or particle detectors is to a large extent due to the many auxiliary subsystems needed to let the most crucial components do their jobs properly: massive cooling systems that prevent powerful magnets from melting, or stabilization modules that hold a sample or a wafer perfectly still. And then there are all the control systems and computers that make these subsystems behave in line with each other, which is a high-speed balancing act involving thousands of parameters that influence one another.
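To make that balancing act a bit more concrete, here is a minimal back-of-the-envelope sketch (the part counts are made up for illustration, not taken from any real machine): if every subsystem or parameter can in principle influence every other one, the number of potential pairwise interactions grows roughly with the square of the part count.

```python
# Illustrative sketch: the number of distinct pairs among n elements
# is n*(n-1)/2, so potential pairwise interactions grow quadratically.

def pairwise_interactions(n_parts: int) -> int:
    """Number of distinct pairs among n_parts elements."""
    return n_parts * (n_parts - 1) // 2

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} parts -> {pairwise_interactions(n):>12,} potential pairwise interactions")
```

Ten parts give 45 possible couplings; ten thousand give almost fifty million, which is why keeping all those subsystems in line with each other becomes a balancing act in its own right.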

Increasing complexity is the result of the way we advance technology in most fields. We devise a concept for achieving something, we then find out that the concept is limited in its performance, and we start adding provisions to optimize performance. All these provisions come in the form of physical parts, modules or software packages, which are added to the system as the need for improved performance arises. It is in essence a brute-force approach. The alternative would be to devise radically new concepts that achieve the same purpose with better performance, but apparently our capability to do so is smaller than our capability to manage the complexity of adding more and more elements to already complex systems. Only when this complexity becomes truly unmanageable do we resort to new concepts – in the hope that this will reset system complexity to a more manageable level.

Back to CERN and the Higgs boson. The effort to demonstrate the existence of this particle, and to validate and expand our understanding of elementary physics, has taken about 25 years. It has required the dedication of the international forerunners in experimental physics. The cost of building the particle accelerator and its four detector systems amounts to many billions of euros. Suppose the next step in particle physics is a step in the same direction: even smaller phenomena to be observed and even more complex experiments to be set up. In that scenario, it is questionable whether the particle physics community can pull it off. Will there be sufficient alignment among the myriad research institutes, funding parties and individual scientists to commit to that one, gargantuan project? Will they be able to amass the manpower, capital and technological knowledge required to bring the experiment to completion within anything like a feasible timeframe? Doubtful, I’d say.

It seems more likely that at this point, new branches will grow from the current trajectory. Perhaps more low-hanging fruit can be identified and pursued – hopefully, the current CERN experiment will provide clues, many of which could be followed up at lower cost and by smaller parts of the scientific community. Or perhaps someone will break with the apparently human tendency to attack increasingly difficult challenges with increasingly complex approaches. To me, this seems like the true challenge: to restructure problems in such a way that we can tackle them in a relatively simple way. Although I suppose that phenomenally complex machines will always remain fascinating.

2 thoughts on “The complexity conundrum”

    1. Herman Budde

      This is ultimately the limitation of any species in science. We are barely capable of detecting the existence of a Higgs boson, let alone neutrinos. Even the age, content and dimensions of the known universe are under continuous debate, especially when observations and theory mismatch.

      Luckily, we humans are an inventive species. Particles that were theoretical only a few decades ago, like atoms, electrons and photons, can now be manipulated with tolerances that are only a fraction of their dimensions. And where in the past the only way to peer further into the sky was with bigger telescopes, we can now launch vast arrays of relatively cheap satellites, or combine the observations of all ground-based telescopes through mathematical models.

      So I predict that, although it took only 25 years to build a particle accelerator to detect the Higgs boson, it’s going to take at least twice as long to detect the particles that make up the Higgs boson. But by that time, manipulating quarks will be as commonplace as manipulating atoms, electrons and photons is now. Manipulating and detecting atoms, electrons and photons with sub-nanometer precision will be part of everyday technology, like quantum flash memories and electro-optical quantum processors.
