Elegant mathematical model universes

Are elegant mathematical models of the universe more important than empirical, observation-based modelling of our solar system, galaxies and universe?

It’s a beautiful theory: the standard model of cosmology describes the universe using just six parameters. But it is also strange. The model predicts that dark matter and dark energy – two mysterious entities that have never been detected – make up 95% of the universe, leaving only 5% composed of the ordinary matter so essential to our existence.
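For the record, and not spelled out in the quote above: the six base parameters of the standard flat ΛCDM fit, in the parameterisation usually quoted by the Planck collaboration, are roughly the following.

```latex
% The six base parameters of the standard (flat) Lambda-CDM cosmological model,
% as commonly parameterised in the Planck analyses:
\begin{align*}
\Omega_b h^2       &: \text{physical baryon density}\\
\Omega_c h^2       &: \text{physical cold dark matter density}\\
100\,\theta_{MC}   &: \text{angular scale of the sound horizon at recombination}\\
\tau               &: \text{optical depth to reionization}\\
\ln(10^{10} A_s)   &: \text{amplitude of primordial scalar fluctuations}\\
n_s                &: \text{spectral index of primordial scalar fluctuations}
\end{align*}
```

Everything else in the model, such as the dark energy density and the Hubble constant, comes out as a derived quantity of these six, which is part of what makes the 95% "dark" content so striking.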

Elegant maths model universes

The idea that fundamental particles are actually tiny bits of vibrating string was taking off, and by the mid-1980s, string theory had lassoed the imaginations of many leading physicists. The idea is simple: just as a vibrating violin string gives rise to different notes, each string’s vibration foretells a particle’s mass and behavior. The mathematical beauty was irresistible and led to a swell of enthusiasm for string theory as a way to explain not only particles but the universe itself.
Gravity is mathematically relatable to dynamics of subatomic particles | phys.org

It’s embarrassing, but astrophysicists are the first to admit it. Our best theoretical model can only explain 5% of the universe. The remaining 95% is famously made up almost entirely of invisible, unknown material dubbed dark energy and dark matter.
Bizarre ‘dark fluid’ with negative mass could dominate the universe | phys.org

Mathematical universe hypothesis

Our external physical reality is a mathematical structure. That is, the physical universe is not merely described by mathematics, but is mathematics (specifically, a mathematical structure). Mathematical existence equals physical existence, and all structures that exist mathematically exist physically as well. Observers, including humans, are self-aware substructures (SASs). In any mathematical structure complex enough to contain such substructures, they will subjectively perceive themselves as existing in a physically ‘real’ world.

The theory can be considered a form of Pythagoreanism or Platonism in that it proposes the existence of mathematical entities; a form of mathematical monism in that it denies that anything exists except mathematical objects; and a formal expression of ontic structural realism.

Tegmark claims that the hypothesis has no free parameters and is not observationally ruled out. Thus, he reasons, it is preferred over other theories-of-everything by Occam’s Razor. Tegmark also considers augmenting the MUH with a second assumption, the computable universe hypothesis (CUH), which says that the mathematical structure that is our external physical reality is defined by computable functions.
Mathematical universe hypothesis | wikipedia

Dark fluid with negative mass and negative gravity


No matter how physically and logically preposterous a proposed universe construct is to the man in the street, as long as there are no obvious peer-reviewed alternatives, mathematical scientists will at least not be able to rule it out.
In an essay for The Conversation, Farnes concedes that the negative mass theory could be incorrect – but also expresses hope that, if it’s borne out by future observations, it could provide a new model for explaining the mysteries of the cosmos.

Despite these efforts, a negative mass cosmology could be wrong, he wrote. The theory seems to provide answers to so many currently open questions that scientists will — quite rightly — be rather suspicious. However, it is often the out-of-the-box ideas that provide answers to longstanding problems. The strong accumulating evidence has now grown to the point that we must consider this unusual possibility.
An Oxford Scientist May Have Solved the Mystery of Dark Matter

You can mathematically propose anything, and as long as it is not currently falsifiable, it has to be considered as an alternative framework. The more elegant the mathematics it uses to possibly explain things, the more popular it might be with those who live in alternative maths universes.

In this new theory, the negative mass particles are continuously created, so the particles are always replenished as the universe expands, he explained. In this new approach, these continuously created negative masses seem to be identical to dark energy. By combining negative mass and matter creation, dark matter and dark energy can be unified into one single substance – a dark fluid.

One of the reasons we know dark matter exists is its gravitational influence over galaxies. Observations show galaxies are spinning far faster than they should – so fast they should be torn apart. Dark matter, it appears, is helping to hold them together. To test his theory, Farnes created a 3D computer model of his dark fluid to see if it would hold a galaxy together. And it did. “The new model has been tested using a simulation of the universe within a computer, and seems to naturally generate dark matter halos around ‘positive mass’ galaxies. This is a direct observational expectation of dark matter, and so seems to indicate that the model has promise. However, there is still much work to be done to test this idea further.”

Farnes says there are limitations to the research: The current model provides no explanation at all for the particle physics that may make negative masses possible, he said. This is currently half of all known physics that is not being included into my model!

However, he also says the nature of mass is poorly understood in particle physics, so ideas of negative mass could be incorporated to explain other scientific conundrums.

Alex Murphy, Professor of Nuclear & Particle Astrophysics at the U.K.’s University of Edinburgh, who was not involved in the study, said the findings are interesting: “It’s one of many efforts trying to provide answers to deeply troubling issues with our understanding of the contents of the universe,” he told Newsweek. “The key result is that if there is the right amount of negative mass matter in the universe, then one can explain the observed motions and distributions of galaxies that otherwise require dark matter and dark energy to exist. That is quite elegant.”
most of the universe is missing — a ‘dark fluid’ with negative mass could explain why | Newsweek

The creator of the field of cosmology, Albert Einstein, did – along with other scientists including Stephen Hawking – consider negative masses. In fact, in 1918 Einstein even wrote that his theory of general relativity may have to be modified to include them.

Bizarre ‘dark fluid’ with negative mass could dominate the universe | The Conversation
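As a very rough illustration of how signed masses behave under ordinary Newtonian gravity – and emphatically not Farnes's actual simulation, whose code, resolution and physics are far richer – here is a minimal two-body sketch in Python. The only assumption is the usual inverse-square law applied with a negative mass allowed: a positive mass attracts everything, a negative mass repels everything, and a positive/negative pair ends up chasing itself in the well-known "runaway" motion.

```python
# Toy illustration (NOT Farnes's simulation): Newtonian gravity with signed masses.
# The acceleration of body i due to body j is a_i = G * m_j * (r_j - r_i) / |r_j - r_i|^3,
# so a positive m_j attracts everything and a negative m_j repels everything,
# regardless of the sign of m_i.
import numpy as np

G = 1.0  # toy units

def accelerations(pos, masses, eps=1e-3):
    """Pairwise Newtonian accelerations with a small softening length eps."""
    n = len(masses)
    acc = np.zeros_like(pos)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = pos[j] - pos[i]
            r = np.sqrt(d @ d + eps**2)
            acc[i] += G * masses[j] * d / r**3
    return acc

# One positive mass at the origin, one negative mass nearby.
pos = np.array([[0.0, 0.0], [1.0, 0.0]])
vel = np.zeros_like(pos)
masses = np.array([1.0, -1.0])

dt = 0.01
for step in range(1000):                 # semi-implicit Euler integration
    vel += accelerations(pos, masses) * dt
    pos += vel * dt

print(pos)  # both bodies have drifted off in the same direction: the "runaway" pair
```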

Non-Euclidean space-time


According to Albert Einstein’s theory of special relativity, instantaneous action at a distance violates the relativistic upper limit on speed of propagation of information. If one of the interacting objects were to suddenly be displaced from its position, the other object would feel its influence instantaneously, meaning information had been transmitted faster than the speed of light.

One of the conditions that a relativistic theory of gravitation must meet is that gravity is mediated with a speed that does not exceed c, the speed of light in a vacuum. From the previous success of electrodynamics, it was foreseeable that the relativistic theory of gravitation would have to use the concept of a field, or something similar.

This has been achieved by Einstein’s theory of general relativity, in which gravitational interaction is mediated by deformation of space-time geometry. Matter warps the geometry of space-time, and these effects are – as with electric and magnetic fields – propagated at the speed of light. Thus, in the presence of matter, space-time becomes non-Euclidean, resolving the apparent conflict between Newton’s proof of the conservation of angular momentum and Einstein’s theory of special relativity.
Einstein – Action at a distance | wikipedia
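For readers who want the formula behind that last paragraph, the standard statement (textbook general relativity, nothing specific to the article above) is that matter sources curvature through the Einstein field equations, and weak-field ripples in the geometry obey a wave equation, so changes in the gravitational field propagate at the speed of light c rather than instantaneously.

```latex
% Einstein field equations: the stress-energy of matter (T) fixes the curvature (G).
G_{\mu\nu} = \frac{8\pi G}{c^{4}}\, T_{\mu\nu}

% Writing the metric as flat space plus a small perturbation, g = \eta + h,
% the trace-reversed perturbation \bar{h} in the Lorenz gauge satisfies a wave
% equation, so disturbances in the field travel at c:
\Box\, \bar{h}_{\mu\nu} = -\frac{16\pi G}{c^{4}}\, T_{\mu\nu}
```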

Dark Matter Hurricanes

But the S1 stellar stream – the “dark matter hurricane” of the headline – just may cause some local spikes in dark matter, which would help researchers hunting dark matter actually find the stuff, the researchers wrote.

That’s because all galaxies, but especially dwarf galaxies, are held together by dark matter, physicists believe. So, the galaxy that was torn to shreds birthing the S1 stream likely dumped a bunch of dark matter into the stream’s path.

The problem is, no existing dark matter-detection devices have actually worked, in part because they’ve all been designed based on educated guesses as to what dark matter really is. (Scientists have very good reason to believe dark matter exists but are still guessing about its composition.)
Do Not Fear the Dark Matter Hurricane (The Dark Matter Hurricane Is Good)

Strings of gravity particles

The key insight is that gravity, the force that brings baseballs back to Earth and governs the growth of black holes, is mathematically relatable to the peculiar antics of the subatomic particles that make up all the matter around us.

This revelation allows scientists to use one branch of physics to understand other seemingly unrelated areas of physics. So far, this concept has been applied to topics ranging from why black holes run a temperature to how a butterfly’s beating wings can cause a storm on the other side of the world.

Meanwhile, the idea that fundamental particles are actually tiny bits of vibrating string was taking off, and by the mid-1980s, “string theory” had lassoed the imaginations of many leading physicists. The idea is simple: just as a vibrating violin string gives rise to different notes, each string’s vibration foretells a particle’s mass and behavior. The mathematical beauty was irresistible and led to a swell of enthusiasm for string theory as a way to explain not only particles but the universe itself…

The breakthrough in the late 1990s was that mathematical calculations of the edge, or boundary, of this anti-de Sitter space can be applied to problems involving quantum behaviors of subatomic particles described by a mathematical relationship called conformal field theory (CFT). This relationship provides the link, which Polyakov had glimpsed earlier, between the theory of particles in four space-time dimensions and string theory in five dimensions. The relationship now goes by several names that relate gravity to particles, but most researchers call it the AdS/CFT (pronounced A-D-S-C-F-T) correspondence.


Gravity is mathematically relatable to dynamics of subatomic particles | www.phys.org
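The AdS/CFT correspondence mentioned above is usually summarised by one schematic equation, the so-called GKPW (Gubser–Klebanov–Polyakov–Witten) dictionary. Very roughly, and glossing over all the technical conditions:

```latex
% Schematic AdS/CFT dictionary: correlation functions of the boundary quantum
% field theory (CFT), with sources phi_0 coupled to operators O, are computed by
% the gravity/string partition function in the anti-de Sitter bulk, with the
% bulk field phi approaching phi_0 at the boundary.
\Big\langle \exp\Big( \int_{\partial\mathrm{AdS}} d^{4}x\; \phi_{0}(x)\,\mathcal{O}(x) \Big) \Big\rangle_{\mathrm{CFT}}
\;=\; Z_{\mathrm{gravity}}\Big[\, \phi\big|_{\partial\mathrm{AdS}} = \phi_{0} \Big]
```

This is the precise sense in which a gravity calculation in five dimensions is "mathematically relatable" to particle physics in four.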

SpiNNaker, the Million-Core Supercomputer, Finally Switched On

After 12 years in the making, the “brain computer” designed at the University of Manchester is finally switched on. What does this computer do? How is it made? And who is Steve Furber?

AI systems have been rapidly developed in the past decade with the use of deep learning, neural networks, and large computers to try and simulate neurons. But AI is not the only area of interest when using such techniques; scientists and engineers alike are also keen to try and simulate the human brain to better understand how it works and why.

Simulating the brain is no trivial task. The complexity of the human brain is difficult to replicate, which is part of why the SpiNNaker computer is important.

The Challenges of Simulating a Brain

One of the first fundamental differences between the brain and computers is how their “smallest units” function. Brain neurons can have multiple connections and react to impulses in a range of different ways. Computer transistors, by comparison, are switches that, while they can be connected to other transistors, can only have one of two states.

Neurons are also able to forge links between other neurons and react to stimuli differently (which is one definition of “learning”), whereas transistor connections are fixed.

Because of these differences, scientists have to “simulate” neurons and connections in software rather than in hardware, which severely impacts the number of neurons and links that can be simulated simultaneously.

What about simulating neurons in hardware?

Neurons and transistors share little in common, but a better comparison would be simple microcontrollers and FPGAs: microcontrollers are akin to neurons in that they can process outside signals quickly while being comparatively simple in architecture, and FPGAs provide the ability to break and create connections between microcontrollers.

Could hardware simulation be the key? One team of researchers believes so and has spent the last 12 years on the idea.

The SpiNNaker

A research team at the University of Manchester has spent the last 12 years creating a computer that simulates neurons and connections with the use of many simple cores, all interconnected in a massively parallel system. The computer, called SpiNNaker, has finally been turned on.

The million-core computer is designed to simulate up to a billion neurons in real-time to allow scientists to study neural networks and pathways in a realistic manner by using hardware as opposed to software.

Unlike traditional methods for simulating neurons, SpiNNaker has individual processors that each simulate up to 1000 neurons that transmit and receive small packets of data to and from many other neurons simultaneously.

Hexagonal topology between processors and a 48-processor SpiNNaker computer (image courtesy of the University of Manchester)

The Spiking Neural Network Architecture (SpiNNaker) system consists of ten 19-inch computer racks, with each rack containing 100,000 ARM cores. This core density is achieved with the use of a custom IC that contains up to 18 cores. Each board in a rack has 48 chips, which results in each board containing 864 processors.

Unlike typical software systems, the cores are arranged in a hexagonal pattern, with data transmission handled entirely in hardware. It is this topology that allows the system to simulate one billion neurons in real time. The system uses ARM9 processors across roughly 57,000 nodes with a total of 7 TB of RAM; each processor has an off-die 128 MB SDRAM, and each core has 32 KB of instruction tightly-coupled memory (ITCM) and 64 KB of data tightly-coupled memory (DTCM) …

https://www.allaboutcircuits.com/news/simulate-human-brain-spinnaker-million-core-computer-switched-on/

https://www.research.manchester.ac.uk/portal/files/60826558/FULL_TEXT.PDF
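To make the programming model concrete, here is a deliberately tiny software sketch of the idea described in the article above: each "core" updates a population of leaky integrate-and-fire neurons and communicates nothing but small spike packets (source-neuron IDs) to the cores its neurons project to. The neuron model, numbers, weights and routing here are illustrative assumptions; this is not SpiNNaker's actual tool chain, packet format or scale.

```python
# Minimal, illustrative sketch of a SpiNNaker-like programming model: many simple
# "cores", each simulating a population of leaky integrate-and-fire (LIF) neurons,
# exchanging spikes as tiny packets (just source-neuron IDs). NOT the real
# SpiNNaker software stack; just the concept, with made-up parameters.
import numpy as np

class Core:
    def __init__(self, core_id, n_neurons=1000, tau=20.0, v_thresh=1.0, dt=1.0):
        self.core_id = core_id
        self.v = np.zeros(n_neurons)        # membrane potentials
        self.decay = np.exp(-dt / tau)      # per-timestep leak of the LIF model
        self.v_thresh = v_thresh
        self.inbox = []                     # spike packets delivered this timestep
        self.weights = {}                   # (src_core, src_neuron) -> [(local_idx, w), ...]

    def step(self, background_current):
        # 1. Sum background drive plus any spikes delivered since the last step.
        i_syn = np.full_like(self.v, background_current)
        for src_id in self.inbox:
            for local_idx, w in self.weights.get(src_id, []):
                i_syn[local_idx] += w
        self.inbox.clear()
        # 2. Leaky integration and threshold test.
        self.v = self.v * self.decay + i_syn
        fired = np.where(self.v >= self.v_thresh)[0]
        self.v[fired] = 0.0                 # reset fired neurons
        # 3. Emit spike "packets": nothing but (core id, neuron index) pairs.
        return [(self.core_id, int(n)) for n in fired]

def wire_up(cores, fan_out=10, seed=0):
    """Give every neuron a few random targets on one random core (toy routing table)."""
    rng = np.random.default_rng(seed)
    for src in cores:
        for n in range(len(src.v)):
            tgt = cores[rng.integers(len(cores))]
            tgt.weights[(src.core_id, n)] = [
                (int(rng.integers(len(tgt.v))), 0.1) for _ in range(fan_out)
            ]

cores = [Core(i, n_neurons=100) for i in range(4)]   # tiny demo: 4 cores x 100 neurons
wire_up(cores)
for t in range(100):                                 # 100 timesteps of 1 ms each
    packets = [p for c in cores for p in c.step(background_current=0.06)]
    for packet in packets:                           # the "router": deliver each packet
        for c in cores:
            if packet in c.weights:
                c.inbox.append(packet)
print("simulated", len(cores) * 100, "toy neurons for 100 ms")
```

On the real machine this delivery step is the part done entirely in hardware: spikes are multicast through the hexagonal mesh of chips rather than looped over in software, which is what makes real-time simulation at the billion-neuron scale plausible.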

3D-printed Deep Learning neural network uses light instead of electrons

It’s a novel idea, using light diffracted through numerous plates instead of electrons. And to some, it might seem a little like replacing a computer with an abacus, but researchers at UCLA have high hopes for their quirky, shiny, speed-of-light artificial neural network.

Coined by Rina Dechter in 1986, Deep Learning is one of the fastest-growing methodologies in the machine learning community and is often used in face, speech and audio recognition, language processing, social network filtering and medical image analysis as well as addressing more specific tasks, such as solving inverse imaging problems.

Traditionally, deep learning systems are implemented on a computer to learn data representation and abstraction and perform tasks on par with, or better than, the performance of humans. However, the team led by Dr. Aydogan Ozcan, the Chancellor’s Professor of electrical and computer engineering at UCLA, didn’t use a traditional computer set-up, instead choosing to forgo all those energy-hungry electrons in favor of light waves. The result was its all-optical Diffractive Deep Neural Network (D2NN) architecture.

The setup uses 3D-printed translucent sheets, each with thousands of raised pixels, which deflect light through each panel in order to perform set tasks. By the way, these tasks are performed without the use of any power, except for the input light beam.

The UCLA team’s all-optical deep neural network – which looks like the guts of a solid gold car battery – literally operates at the speed of light, and will find applications in image analysis, feature detection and object classification. Researchers on the team also envisage possibilities for D2NN architectures performing specialized tasks in cameras. Perhaps your next DSLR might identify your subjects on the fly and post the tagged image to your Facebook timeline.


“Using passive components that are fabricated layer by layer, and connecting these layers to each other via light diffraction created a unique all-optical platform to perform machine learning tasks at the speed of light,” said Dr. Ozcan.

For now though, this is a proof of concept, but it shines a light on some unique opportunities for the machine learning industry.

The research has been published in the journal Science.

[Sources]
https://newatlas.com/diffractive-deep-neural-network-uses-light-to-learn/55718/
http://innovate.ee.ucla.edu/
https://arxiv.org/abs/1804.08711
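A rough way to picture what those 3D-printed layers are doing, in code: each layer imprints a pattern of phase delays on the incoming light, and the light then diffracts freely to the next layer. The NumPy sketch below simulates that forward pass with angular-spectrum propagation. The grid size, pixel pitch and layer spacing are made-up placeholder values (the illumination is taken as roughly 0.4 THz terahertz light, in the spirit of the published work), the layer phases are random rather than trained, and none of this is the UCLA team's code; it only illustrates the optics behind the D2NN idea.

```python
# Toy forward pass of a diffractive deep neural network (D2NN): light passes
# through several phase-only "layers" with free-space diffraction (angular
# spectrum method) between them. Placeholder sizes and untrained phases.
import numpy as np

N = 256                 # grid size in pixels per side          -- assumed
pitch = 40e-6           # pixel pitch in metres                 -- assumed
wavelength = 0.75e-3    # ~0.4 THz illumination, in metres      -- rough value
z = 0.03                # spacing between layers in metres      -- assumed

def angular_spectrum(field, distance):
    """Propagate a complex optical field a given distance in free space."""
    fx = np.fft.fftfreq(N, d=pitch)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * distance) * (arg > 0)   # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Five phase-only layers with random (i.e. untrained) phase values in [0, 2*pi).
rng = np.random.default_rng(0)
layers = [np.exp(1j * rng.uniform(0, 2 * np.pi, (N, N))) for _ in range(5)]

# Input: a simple bright square standing in for the object to be classified.
field = np.zeros((N, N), dtype=complex)
field[96:160, 96:160] = 1.0

for phase_mask in layers:
    field = angular_spectrum(field, z)   # diffract to the next printed layer
    field = field * phase_mask           # the layer imprints its phase delays

output_intensity = np.abs(angular_spectrum(field, z)) ** 2
# In the real device, detector regions in this output plane stand for classes;
# training adjusts the printed phase values so the correct region lights up.
# With random phases, as here, only the passive optics is being demonstrated.
print(output_intensity.shape, output_intensity.sum())
```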

How to predict the side effects of millions of drug combinations.

Doctors have no idea, but Stanford University computer scientists have figured it out, using artificial intelligence

July 11, 2018

An example graph of polypharmacy side effects derived from genomic and patient population data, protein–protein interactions, drug–protein targets, and drug–drug interactions encoded by 964 different polypharmacy side effects. The graph representation is used to develop Decagon. (credit: Marinka Zitnik et al./Bioinformatics)

Millions of people take up to five or more medications a day, but doctors have no idea what side effects might arise from adding another drug.*

Now, Stanford University computer scientists have developed a deep-learning system (a kind of AI modeled after the brain) called Decagon** that could help doctors make better decisions about which drugs to prescribe. It could also help researchers find better combinations of drugs to treat complex diseases.

The problem is that with so many drugs currently on the U.S. pharmaceutical market, “it’s practically impossible to test a new drug in combination with all other drugs, because just for one drug, that would be five thousand new experiments,” said Marinka Zitnik, a postdoctoral fellow in computer science and lead author of a paper presented July 10 at the 2018 meeting of the International Society for Computational Biology.

With some new drug combinations (“polypharmacy”), she said, “truly we don’t know what will happen.”

How proteins interact and how different drugs affect these proteins

So Zitnik and associates created a network describing how the more than 19,000 proteins in our bodies interact with each other and how different drugs affect these proteins. Using more than 4 million known associations between drugs and side effects, the team then designed a method to identify patterns in how side effects arise, based on how drugs target different proteins, and also to infer patterns about drug-interaction side effects.***

Based on that method, the system could predict the consequences of taking two drugs together.

To evaluate the system, the group looked to see if its predictions came true. In many cases, they did. For example, there was no indication in the original data that the combination of atorvastatin (marketed under the trade name Lipitor among others), a cholesterol drug, and amlodipine (Norvasc), a blood-pressure medication, could lead to muscle inflammation. Yet Decagon predicted that it would, and it was right.

In the future, the team members hope to extend their results to include more multiple drug interactions. They also hope to create a more user-friendly tool to give doctors guidance on whether it’s a good idea to prescribe a particular drug to a particular patient, and to help researchers developing drug regimens for complex diseases, with fewer side effects.

Ref.: Bioinformatics (open access). Source: Stanford University.

* More than 23 percent of Americans took three or more prescription drugs in the past 30 days, according to a 2017 CDC estimate. Furthermore, 39 percent over age 65 take five or more, a number that’s increased three-fold in the last several decades. There are about 1,000 known side effects and 5,000 drugs on the market, making for nearly 125 billion possible side effects between all possible pairs of drugs. Most of these have never been prescribed together, let alone systematically studied, according to the Stanford researchers.

** In geometry, a decagon is a ten-sided polygon.

*** The research was supported by the National Science Foundation, the National Institutes of Health, the Defense Advanced Research Projects Agency, the Stanford Data Science Initiative, and the Chan Zuckerberg Biohub.

Source: KurzweilAI.net, Stanford.edu
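To make the data structure in the Stanford piece concrete, here is a tiny illustrative sketch of the kind of multimodal graph it describes: protein-protein interaction edges, drug-protein target edges, and drug-drug edges labelled with known side effects, plus a naive proximity score for an unseen drug pair. The node names and the scoring rule are invented placeholders; Decagon itself learns its predictions with a graph convolutional neural network over a vastly larger graph of this shape, which is not reproduced here.

```python
# Illustrative sketch of the multimodal graph described above: proteins, drugs,
# and drug-drug edges labelled with side effects. Toy node names and a naive
# proximity score; Decagon's real model is a graph convolutional network.
from collections import defaultdict
from itertools import combinations

# Three edge types in one heterogeneous graph (toy data).
protein_protein = {("P1", "P2"), ("P2", "P3"), ("P3", "P4")}          # PPI edges
drug_targets = {"drugA": {"P1", "P2"}, "drugB": {"P3"}, "drugC": {"P2", "P3"}}
known_side_effects = {                        # (drug, drug) -> observed side effects
    ("drugA", "drugB"): {"muscle inflammation"},
    ("drugB", "drugC"): {"nausea"},
}

def target_proximity(d1, d2):
    """Count pairs of targets of d1 and d2 that are identical or interact (a crude signal)."""
    ppi = defaultdict(set)
    for a, b in protein_protein:
        ppi[a].add(b)
        ppi[b].add(a)
    return sum(
        1
        for p in drug_targets[d1]
        for q in drug_targets[d2]
        if p == q or q in ppi[p]
    )

# "Query" the graph: for drug pairs with no recorded side effects, report how
# entangled their protein targets are. In Decagon this role is played by a
# learned link-prediction score for each specific side-effect type.
for pair in combinations(sorted(drug_targets), 2):
    if pair not in known_side_effects:
        print(pair, "target-proximity score:", target_proximity(*pair))
```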