Complexity Matters
The Complexity Matters blog features the Thursday Complexity Post as well as other complexity-inspired news items.





Future of Artificial Intelligence: Optimistic or Ominous?

Posted By Prucia Buscell, Thursday, December 11, 2014

If computers become smarter than we are, will they just keep us as pets or is civilization doomed? And if artificial intelligence fully surpasses ours, will that change what it means to be human?

Kurt Andersen's Vanity Fair article "Enthusiasts and Skeptics Debate Artificial Intelligence" probes these questions. He starts by asking Siri, the artificial intelligence (AI) app on his iPhone, "What is the Singularity?" Siri replies, "A technological singularity is a predicted point in the development of a civilization at which technological progress accelerates beyond the ability of present day humans to fully comprehend or predict." Some futurists predict the Singularity will arrive by 2050. Brilliant scientists and scholars differ on what that might mean, and the possibilities are being explored in popular films.

In the movie "Transcendence," Johnny Depp stars as an AI genius trying to create an omniscient machine that also has a full range of human emotion. Terrorist Luddites poison Depp's character, but his consciousness is uploaded to the cloud, leading survivors to wonder whether the disembodied intelligence is really him. The Spike Jonze movie "Her," set in the not too distant future, explores human-machine interaction as a man falls in love with an artificially intelligent, female-voiced and surprisingly fickle computer operating system. Alan Turing, the mathematical genius who founded computer science in the 1930s, is also the subject of a recent movie. "The Imitation Game" is based on Turing's life, his work cracking the Nazis' Enigma code, and his fascination with AI. The title comes from Turing's proposal for a test that would show when computers can pass as human.

Andersen interviews Ray Kurzweil, who wrote The Singularity Is Near in 2005 and who is now Google's director of engineering, leading a research team trying to create software that can communicate in a fully human way. Kurzweil is also the co-founder, with Peter Diamandis, of Singularity University, based at NASA Research Park in Silicon Valley and funded by what Andersen calls a "digital-industrial-complex pantheon: Cisco, Genentech, Nokia, GE, Google." Diamandis is an Ivy League educated entrepreneur whose book Abundance: The Future Is Better Than You Think lays out a utopian future in which technology makes a healthy environment and social harmony possible.

Jaron Lanier, the virtual reality pioneer who now works at Microsoft Research, told Andersen he thinks machines might become convincingly human some time before the end of this century. Philanthropist and Microsoft co-founder Paul Allen wrote in the MIT Technology Review that "The Singularity Isn't Near," and Kurzweil disagreed in a published response. In an interview with Andersen, Allen says neuroscience research shows many complexities of the human brain are still mysterious and the scale and scope of the "known unknowns" remain vast, so he doubts human-capable AI will arrive soon. The deeper we look into natural systems, he says, the more we have to expand our knowledge and theories to characterize their detailed operations; he calls this slowing effect the "complexity brake."

Some scientists worry about technological superiority. Jaan Tallinn, who helped create Kazaa and co-founded Skype, thinks technology may not benefit us if we lose control of its development. Elon Musk, co-founder of Tesla Motors and founder of the space travel company SpaceX, calls AI "our biggest existential threat."

Lanier's most recent book, Who Owns the Future?, is about politics, power, economics and jobs. He's not afraid smart machines will enslave us. He told Andersen he's worried that machine owners, the digital big businesses, will use technology to impoverish and disempower the middle and working classes. Andersen writes that in Lanier's skeptical view, today's big data and mass market AI "amount to a stupendous con." The crowd-sourced digital powerhouses like Google, YouTube and Facebook, Lanier asserts, "are Tom Sawyer, and we're whitewashing their fences for free because they've bedazzled and tricked us into thinking it's fun." Read the provocative Vanity Fair article, with its thoughtful interviews and commentary, here.

Tags:  buscell  complexity matters  science  technology 


Thought-Controlled Gene Expression

Posted By Prucia Buscell, Wednesday, November 26, 2014

Scientists at ETH Zurich have constructed a networked system in which gene expression can be controlled remotely by human thought, and they hope that eventually thought-controlled brain implants will help combat neurological diseases.

A team of researchers led by Martin Fussenegger implanted a living mouse with designer cells that can be controlled with light. As a story by John Hewett in ExtremeTech notes, that's challenging enough, but what they did next is jaw-dropping. Electrical signals from the brain of a human wearing a brain-computer interface (BCI) remotely activated genes in the mouse's brain implant by turning on the light. The mouse implant was wirelessly linked to the human monitor by a Bluetooth device.

A story in The Scientist by Jyoti Madhusoodanan says this achievement marks the first time two known technologies, optogenetics, which uses light-sensitive proteins to control gene expression, and EEG-based BCI, which harnesses the brain's electrical potential to create a physical output, have been combined this way. Synthetic biologist Timothy Lu at MIT, who was not involved in the research, describes the work as "awesome."

BCIs that capture the electrical neural impulses in the brain have been used in the past to control cursors and prosthetic devices. Fussenegger's team developed a gene-regulation method that enables thought-specific brain waves to control gene expression, that is, the conversion of genes into proteins.

The ExtremeTech story says one inspiration for the new system was the game Mindflex, in which players wear a forehead sensor that records brainwaves and transfers them to the playing environment by EEG. The EEG controls a fan that enables a small ball to be thought-guided through an obstacle course.

Researchers discovered that the state of mind of the human participant regulated the quantity of an experimentally used protein released by the implant into the mouse's bloodstream. Human participants were asked to play a focused game of Minecraft for 10 minutes, control their brain activity in response to a visual light display, or just relax or meditate. "In all three mental states, the brain produced very specific (electrical) signatures," Fussenegger told The Scientist.

"For the first time, we have been able to tap into human brainwaves, transfer them wirelessly to a gene network, and regulate the expression of a gene depending on the type of thought. Being able to control gene expression via the power of thought is a dream that we've been chasing for over a decade," Fussenegger says in the story.

Eventually, the ExtremeTech story says, researchers hope the thought-controlled implant and the controlling thoughts will exist in one person, or perhaps in two appropriately synchronized persons. The idea is that one day someone with a mind-controlled implant might be able to think about something (say, wanting more adrenaline, dopamine or insulin) and have the implant dutifully trigger release of whatever chemical is needed.

The extremely complex research that led to this extraordinary breakthrough is described in Nature Communications.

Tags:  buscell  complexity matters  research  science 


Earthquakes, Forest Fires, Stars and Brains

Posted By Prucia Buscell, Thursday, April 10, 2014

Human brain activities that give rise to thinking may be akin to the dynamics of earthquakes, forest fires, the spread of contagious disease, the distribution of galaxies in the universe and the sand in an hourglass.

Flip an hourglass upside down, and sand running into the bottom of the glass forms a pile that eventually becomes so unstable that one more grain can cause the pile to collapse into an avalanche. When that happens, the base of the sand pile flattens out, another pile begins, and then it too reaches a point where it collapses. Through several avalanches of varying sizes, the sand pile maintains overall stability. It's a process Danish-American scientist Per Bak called "self-organized criticality."

When he died in 2002, The New York Times described Dr. Bak as an "intellectually pugnacious physicist who sought to understand how complexity arises in the world," and how the simple particles that make up the universe could be transformed into the extraordinarily intricate order found in nature. A story by Jennifer Ouellette in Quanta Magazine, reprinted in Scientific American, explains that Dr. Bak found an answer in phase transition, the process in which materials pass from one state to another. The phase change of water to steam, for example, depends only on temperature and air pressure. Ouellette explains that Dr. Bak proposed a kind of phase change in which local interactions among the many elements of a complex system spontaneously self-organize to reach the tipping point he called criticality. In a 1987 paper in Physical Review Letters, Dr. Bak and his coauthors described self-organized criticality as the underlying mechanism behind the flow of rivers, the luminosity of stars, and what happens in sand piles and other dynamical systems. His book How Nature Works expands on the idea.

Neuroscientists didn't immediately embrace Dr. Bak's ideas about brain function when he proposed them 15 years ago. In the last decade, however, EEG recordings of the interactions among individual brain neurons, large-scale studies comparing computer model predictions with fMRI images, and examinations of slides of cortical tissue have produced evidence that the brain exhibits properties of criticality. Neurophysiologist Dante Chialvo of the University of California at Los Angeles is among the renowned scientists who now think self-organized criticality could explain brain activity. The idea is also being explored by national and international research efforts.

Getting back to the hourglass: Ouellette explains that when the sand pile, a complex system with millions of tiny elements, reaches the critical point, there is no way to predict which grain will cause the next avalanche, how big any avalanche will be, or how many there will be before all the sand is in the bottom of the glass. What you can predict is that the falling of one extremely tiny grain can have a big impact, and that while the overall stability of the system is maintained (there's still a pile), there will be more small avalanches than big ones, in line with what mathematicians call power laws.
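Dr. Bak's pile can be played with directly in code. Below is a minimal sketch of his sandpile model (the Bak-Tang-Wiesenfeld rules), not a simulation of real sand; the grid size and the number of dropped grains are arbitrary illustrative choices.

```python
# Toy Bak-Tang-Wiesenfeld sandpile (an idealized model, not real sand;
# grid size and drop count are arbitrary illustrative choices).
# Grains land on a grid; any cell holding 4 or more grains "topples,"
# sending one grain to each neighbor (grains fall off at the edges).
# The avalanche size is the number of topplings one dropped grain triggers.
import random
from collections import Counter

def drop_grain(grid, size):
    """Drop one grain at a random cell, relax the pile, return avalanche size."""
    r, c = random.randrange(size), random.randrange(size)
    grid[r][c] += 1
    topples = 0
    unstable = [(r, c)]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue            # already relaxed (duplicate entries are harmless)
        grid[i][j] -= 4
        topples += 1
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < size and 0 <= nj < size:
                grid[ni][nj] += 1
                if grid[ni][nj] >= 4:
                    unstable.append((ni, nj))
        if grid[i][j] >= 4:     # very tall cells may topple more than once
            unstable.append((i, j))
    return topples

random.seed(0)
SIZE = 20
grid = [[0] * SIZE for _ in range(SIZE)]
sizes = [drop_grain(grid, SIZE) for _ in range(20000)]
counts = Counter(sizes)
# Heavy-tailed distribution: tiny avalanches vastly outnumber big ones,
# yet an occasional single grain sets off a large cascade.
print("largest avalanche:", max(sizes))
print("avalanches of size <= 2:", sum(v for s, v in counts.items() if s <= 2))
print("avalanches of size > 50:", sum(v for s, v in counts.items() if s > 50))
```

Plotting `counts` on log-log axes yields the roughly straight line characteristic of a power law, the signature of criticality that Ouellette describes.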

The exact moment of transition in a phase change is the critical point, when the system is halfway between one phase and the next. The tens of billions of neurons in our brains, with their connections and interactions, produce "the emergent process we call thinking," the Quanta article says. It goes on to say that Dr. Bak's idea "implies that most of the time, the brain teeters on the edge of a phase transition, hovering between order and disorder."

Tags:  buscell  complexity matters  nature  neuroscience  science 


What Scientific Ideas Should be Scrapped? Brilliant Thinkers Have Some Suggestions

Posted By Prucia Buscell, Thursday, January 23, 2014

Nicholas Christakis suggests we need to get over our obsession with statistical averages.

Christakis is a physician and social scientist who coauthored the book Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives. He says we've persuaded ourselves that mathematical averages are the most important way to compare things: countries, professions, actions and groups of people. Instead, he says, we need to compare variances, which capture the range or spread of whatever value we're trying to measure.

For instance, he writes, the U.S. and Sweden may have nearly the same average income, but the variance in income (income inequality) is far greater in the U.S. So the variance may have more to do with life in either country than the average. He says a more equal distribution of income might improve the health of the group, and even of individuals within the group, so we might wish for more equality at the expense of wealth. But in some cases, he goes on, inequality might be better. If you were gathering a crew of 10 sailors, would it be better if they were all equally myopic, or if one had perfect vision and the remaining nine varied in their degree of visual impairment? He says you'd probably choose more inequality in exchange for one sailor with reliable vision.
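Christakis's distinction is easy to see numerically. Here is a tiny sketch with made-up income figures (the numbers are invented purely for illustration):

```python
# Two hypothetical income samples (numbers invented for illustration)
# with the same average but very different spreads -- the distinction
# Christakis says we too often ignore when we compare only averages.
from statistics import mean, pvariance

equal_society = [48, 50, 50, 52, 50]     # low inequality
unequal_society = [10, 20, 30, 40, 150]  # high inequality

print(mean(equal_society), mean(unequal_society))            # identical averages: 50 50
print(pvariance(equal_society), pvariance(unequal_society))  # variances: 1.6 2600
```

Judged by the average alone the two societies look interchangeable; the variance is what tells them apart.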

Christakis is one of the scholars, scientists, thinkers, artists and authors who responded to this year's question: What scientific idea is ready for retirement? Cultural impresario John Brockman, who founded Edge, an online salon for provocative ideas and intellectual debate, has been posing an intriguing open-ended annual question since 1998. Last year's question was "What should we be worried about?" Earlier queries have included such challenges as "What would change everything?" and "What do you believe that you can't prove?" The Guardian calls Edge a forum for the world's most brilliant minds, and the 174 essays submitted so far for this year's query offer a dazzling cognitive buffet.

We make social tradeoffs, Christakis says, and examining variance will help us probe such questions as whether we want a richer, less equal society, whether we want educational programs to increase equality of test scores, or average performance, and even whether cancer patients might prefer a drug that extends lives for some but kills others. Other thinkers have proposed that we jettison current notions of infinity, information overload, big data, cause and effect, free will, and truth.

Eldar Shafir, author and psychology professor at Princeton, would scrap the idea that opposites can't both be right. He says sadness and happiness, stupidity and wisdom, and goodness and evil can all coexist, and context matters. He cites a study in which seminary students, immersed in Biblical and ethical learning, were asked to deliver a lecture on the parable of the Good Samaritan. Half were told they were comfortably ahead of schedule, and half were told they were late. On their way to the talk, all encountered a presumably injured man slumped in a doorway groaning. Most of those who thought they had time stopped to help. Among those who thought they were late, only 10 percent tried to help; the rest stepped over the victim and rushed along.

Nicholas G. Carr, author of The Shallows and The Big Switch, thinks we should ditch "anti-anecdotalism" and recognize that life is made up of those little stories. Science that scorns them risks veering away from the actual experience of life, he asserts, adding "The empirical, if it's to provide anything like a full picture, needs to make room for both the statistical and the anecdotal." To W. Daniel Hillis, a computer scientist with the technology company Applied Minds, the concept of cause and effect is just an artifact of our brains' proclivity for storytelling. He'd let that go. Gary Klein, a psychologist with MacroCognition, thinks the idea of evidence-based medicine impedes progress because it discourages exploration of treatments not tested in randomized controlled trials. He notes patients suffer from far more conditions than can be controlled for. Dean Ornish, founder and president of the nonprofit Preventive Medicine Research Institute, thinks we could sideline large randomized controlled trials. Size doesn't always matter, he says, and a randomized controlled trial "may introduce its own biases." He calls for more creative experimental designs. Click here to view all the essays.

Tags:  buscell  complexity matters  research  science 


Insect Drones and Pig Replacement Parts

Posted By Prucia Buscell, Thursday, March 14, 2013

Drones have been in the news lately, and the next generation of drones may involve an even more controversial and exotic technology. They may be very tiny cyborgs.

Amit Lal, an engineer, Cornell professor and program manager for the Defense Advanced Research Projects Agency (DARPA), wrote a proposal for prospective researchers years ago suggesting that if scientists could hack into insect bodies and control their movements, they’d have a real start on small-scale flying machines. Michael Maharbiz at the University of California, Berkeley, took up the challenge. He and his team began researching the biology of Mecynorrhina torquata, which at 2-3 inches long is the world’s second-largest flower beetle. Its hard shell and size make it capable of carrying a significant amount of cargo, including a "backpack” of electronic gear attached to its back with beeswax. Researchers wired the creature’s brain so it could be steered remotely, and loaded the backpack with a tiny battery, a miniature radio receiver and a custom-built circuit board.

Emily Anthes describes the race to create insect cyborgs in an article in The Guardian. She is also author of a new book, Frankenstein's Cat: Cuddling Up to Biotech's Brave New Beasts, which tells the story of DARPA’s quest. Anthes quotes Maharbiz as saying the beetlebots, which still haven’t been deployed in the field, will be able to provide intelligence in military operations and save lives in earthquakes by directing rescue teams to humans trapped in rubble. Critics have worried that cyborg beetles could be used to launch germ warfare or spy on civilians, but Maharbiz scoffs at such sinister suggestions. He is now working on a remote-controlled cyborg fly, an even more difficult project because of the fly’s smaller size and weight. Such cyborg insects could fly into buildings and caves, alerting soldiers and distant observers to the presence of explosives and providing information to gauge whether human occupants were enemies or civilians.

Bioengineering has advanced dramatically since Dolly the Sheep was cloned in 1996. Listen to Anthes’s interview with Terry Gross on NPR's Fresh Air. Some wealthy people clone their pets (it’s six figures for a dog), and people can buy genetically modified bright-colored fish that glow in the dark. But there are more serious endeavors. It’s no longer rare for a human to receive a valve from a pig heart. Scientists are now trying to grow pigs that could supply whole organs for human transplants, and goats injected with human genes whose protein-rich milk has the antibiotic properties of human breast milk. Anthes describes how Chinese scientists are identifying the functions of each gene in the mouse genome by disabling one gene at a time and monitoring how the mutant mice develop. She told Gross that among the lab’s 45,000 mouse cages there are mice with cancer, male pattern baldness, obsessive-compulsive disorder, and some that are only able to turn left. The discoveries could eventually help us understand genes involved in human diseases and afflictions.

And the ethics of all this? Anthes concedes it’s complex. People might accept an experiment intended to treat cancer more readily than one to prevent baldness, says Anthes. Is it all right to risk harm to animals in the name of research? Unintended consequences can’t be ruled out in experimental work, Anthes says, so researchers have to worry. Scientists successfully engineered leaner, faster growing pigs, she notes, but the pigs were miserable with arthritis, eye problems and other health woes. In a New York Times essay, Anthes describes genetically modified salmon that grow faster because they’ve been engineered to carry the genes of another species, the ocean pout. It soon may be the first transgenic animal in the human food supply. While Anthes agrees with the need for painstaking evaluation, she hopes fear of genetic modification won’t prevent innovative scientific research with potential to help human health.

photo: male and female Mecynorrhina torquata beetles

Tags:  buscell  complexity matters  nature  science 


Innovative Scientific Study Examines A Partnership Perfected by Practice

Posted By Prucia Buscell, Thursday, August 6, 2009
Updated: Thursday, February 17, 2011
When I bestride him, I
soar, I am a hawk: he trots the air; the earth
sings when he touches it; the basest horn of his
hoof is more musical than the pipe of Hermes.
Dauphin from
Henry V, Act III, Scene VII, by William Shakespeare

The ancient partnership of horse and rider that has inspired warriors, artists and poets, has been something of a mystery to science.

Biologists and physicists have long been intrigued by the way animals manage the rhythmic movements of their running, loping, and galloping. J.A. Scott Kelso and J. Lagarde, both of Florida Atlantic University (FAU), and C. Peham and T. Licka of the University of Veterinary Medicine in Vienna, point out that researchers have used the mathematical theory of weakly coupled oscillators, group symmetry and the concepts of synergetics to further their understanding of animal locomotion.

In their article in the Journal of Motor Behavior, "Coordination Dynamics of the Horse-Rider System," Dr. Kelso and his colleagues report on innovative research that extends this work by examining the horse and rider as an "informationally coupled dynamical system." That means they looked at the mutual interactions between the two different and highly complex systems, horse and rider, to find the essential features of functional coordination between them. Visit the FAU Human Brain and Behavior Laboratory for background on coordination dynamics.

The researchers meticulously measured the motions of horse and rider as the horse trotted on a 12-meter track in an indoor riding arena. One novice rider and one experienced rider, separately riding the same horse, were studied. During measurements, a riding instructor judged the performance of horse and rider using guidelines of the Fédération Équestre Internationale.

Dr. Kelso and colleagues found that the expert rider's motions were in continuous phase synchrony with those of the horse, which was not the case with the novice. Further, they say, the difference between the novice and expert riders indicates that phase synchronization is not spontaneous, but is the result of the expert rider's learned capacity to adapt to the motion of the horse. The skilled rider is better able to anticipate and use the tactile information that rider and horse share through their points of contact: saddle, reins, stirrups, and the legs of the rider and trunk of the horse. In Shakespeare's telling, the Dauphin, the son of the king of France, was in perfect harmony with his horse, but the French still lost the Battle of Agincourt.

The authors cite research showing that similar coordination dynamics have been observed in the study of animal gait patterns, in coordination between sensory and motor systems, between people, and in neural circuitry.

"The reported phase synchrony between the horse and rider clearly belongs to a family of processes generic to the organization of complex physical, chemical and biological systems," the authors write. "But whereas phase synchronization is considered universal and spontaneous in weakly coupled oscillators, it is far from a given here: Our results showed that the coordination dynamics between two such vastly different brain-body systems as a horse and a rider requires practice and training. Stiff and tense movements must become fluid and flexible. Skill in this case requires sensitivity to and anticipation of the horse's motion. It's all about 'feel,' and some people apparently never get it." Little is known, they say, about how this skill is actually developed.
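The contrast the authors draw, spontaneous synchrony in weakly coupled oscillators versus the trained synchrony of horse and rider, can be illustrated with a classic two-oscillator phase model. This is a generic textbook sketch, not the authors' model; the frequencies, coupling strengths and step sizes below are arbitrary choices for illustration.

```python
# Two coupled phase oscillators (a generic Kuramoto-style sketch, not the
# authors' horse-rider model; frequencies and couplings are arbitrary).
# With coupling K, the phase gap d = th2 - th1 obeys
#     d' = (w2 - w1) - 2*K*sin(d),
# so the pair phase-locks whenever 2*K exceeds the frequency mismatch.
import math

def final_phase_gap(K, steps=20000, dt=0.001):
    """Euler-integrate the pair and return the final gap wrapped to (-pi, pi]."""
    th1, th2 = 0.0, 1.0       # initial phases
    w1, w2 = 1.0, 1.3         # natural frequencies (mismatch 0.3)
    for _ in range(steps):
        d = th2 - th1
        th1 += dt * (w1 + K * math.sin(d))
        th2 += dt * (w2 - K * math.sin(d))
    return math.atan2(math.sin(th2 - th1), math.cos(th2 - th1))

weak = final_phase_gap(K=0.05)   # 2K = 0.1 < 0.3: no lock, the gap keeps drifting
strong = final_phase_gap(K=1.0)  # 2K = 2.0 > 0.3: gap settles near asin(0.15)
print("strong-coupling phase gap:", round(strong, 3))
```

Dial K down toward the frequency mismatch and the lock disappears, which is, loosely, the authors' point: between systems as different as horse and rider, synchrony must be learned rather than emerging for free.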

Tags:  buscell  complexity matters  innovation  research  science 


Anxious? You May Sweat But Others Will Empathize

Posted By Prucia Buscell, Thursday, July 23, 2009
Updated: Thursday, February 17, 2011
"Smell is a potent wizard that transports you across thousands of miles and all the years you have lived." Helen Keller, American author and educator, who was blind and deaf.

Scientists and poets have long known that our sense of smell has power to unleash memories. Researchers now find even smells that are not consciously identified can inspire empathy.

Bettina Pause and colleagues at the University of Düsseldorf in Germany collected sweat from 49 students facing oral final exams, and from the same students again as they exercised. Another group of students sniffed the sweat samples while having their brains scanned. When the volunteers sniffed the academic stress sweat, their brains lit up in areas that process social and emotional stimuli and regulate feelings of empathy. The findings, published in the June issue of PLoS ONE, report that only half of the participants detected any sweat smell, and none could tell the difference between sports sweat and anxiety sweat.

A New Scientist story explains that researchers think anxiety prompts the release of a chemical that bypasses consciousness and triggers similar feelings in the person who smells it. The scent of sweat from a skydiver, for instance, could induce anxiety in a person who smelled it, even unknowingly.

In his engaging book Proust Was a Neuroscientist, Jonah Lehrer writes that 90 percent of what we think is taste is actually based on what we smell. He tells us that when the great French chef Auguste Escoffier, the inventor of veal stock, launched the practice of serving food piping hot, our olfactory senses enhanced gustatory delight as the volatile molecules evaporated into the air.

Our senses of taste and smell embody multiple memories. Lehrer quotes Proust:

"When from a long distant past nothing subsists, after the people are dead, after the things are broken and scattered, taste and smell alone, more fragile but enduring, more unsubstantial, more persistent, more faithful, remain poised for a long time, like souls, remembering, waiting, hoping, amid the ruins of all the rest; and bear unflinchingly, in the tiny and almost impalpable drop of their essence, the vast structure of recollection."

So it is perhaps not surprising that an undetected scent can elicit empathy.

Tags:  buscell  complexity matters  research  science 


They Can Run and They Can Hide But Cheetahs Are Still Endangered

Posted By Prucia Buscell, Thursday, July 9, 2009
Updated: Thursday, February 17, 2011
The cheetah has an asymmetrical gait, observes Penny Hudson, and when it gallops "it does different things with either side of its body."

Ms. Hudson is a doctoral student at the Royal Veterinary College (RVC) in the UK, where scientists are trying to figure out exactly what makes cheetahs able to run faster than any other known living creature. A BBC news story by Rebecca Morelle quotes Professor Alan Wilson, head of the structure and motion laboratory at RVC, as saying, "The cheetah is fascinating because it can run 50% faster than any other animals we are familiar with, so in terms of understanding what limits how fast you can run, the cheetah is a wonderful animal to study."

Observers have clocked cheetahs sprinting at 64 to 70 miles an hour, and some authorities think they might be able to move even faster. They are also incredibly nimble and can accomplish sudden high-speed turns in pursuit of prey.

Greyhounds, the hunting dogs bred for speed, can run about 40 miles an hour. The fastest human might be able to reach about 27 miles an hour. As Ms. Hudson explains in the BBC story, a greyhound's speed is thought to be limited by how fast the dog can swing its legs, and human speed is thought to be limited by leg length. The swiftness of the cheetah remains a mystery.

Scientists think the cheetah's velocity might be possible because of its flexible spine, its slender long-legged body, its small round head and powerful shoulders, and a long tail used for balance. It is also the only cat that can't retract its claws, so the claws grip the earth like cleats on a baseball shoe.

Professor Wilson and Ms. Hudson are using high-speed photography to capture and analyze the movements on both sides of running cheetahs. They are also using a running track with embedded plates that measure the forces moving through the animals' legs as they run. How do they get the cheetahs to run? They tie chicken bits to a fast-moving string, and the animals sprint as they would to catch prey in the wild.

Cheetahs are the most endangered cat in Africa. Scientists estimate there were 100,000 cheetahs in Africa and Asia in 1900; the population has dwindled to an estimated 9,000 to 12,000 in Africa. Ironically, their amazing fleetness can have a downside: they are often so exhausted after their high-speed exertions that lions and hyenas can steal their meals right from under their noses before they have recovered. Unlike most big cats, they hunt in daylight, and the tear-shaped black streaks from their eyes to their mouths may act as antiglare mechanisms. They can get quite close to prey because their spots camouflage them well in their natural grassy habitats.

Tags:  adaptive  buscell  complexity matters  nature  science 
