Month: January 2018

Using physics, math and models to fight cancer drug resistance

Despite the increasing effectiveness of breast cancer treatments over the last 50 years, tumors often become resistant to the drugs used. While drug combinations could be part of the solution to this problem, their development is very challenging. In this blog post Jorge Zanudo explains how physical and mathematical models can be combined with clinical and biological data to determine which drug combinations would be most effective in breast cancer therapy.

Source: blogs.springeropen.com

Workshop “Stochastic models in ecology and evolutionary biology”, 5-7th April 2018, Venice. Registration Open!

Living systems are characterized by the emergence of recurrent dynamical patterns at all scales. Self-organized behaviors are observed both in large communities of microscopic components – neural oscillations and gene-network activity, for example – and at larger scales, such as predator-prey equilibria. Such regularities are deemed universal in the sense that they arise from common mechanisms, independent of the details of the system. This belief justifies investigating them through quantitative models that grasp the key features while disregarding inessential complications. Modeling such complex systems naturally leads one to consider large families of identical microscopic units; complexity and self-organization then arise on a macroscopic scale from the dynamics of these minimal components, which evolve coupled by interaction terms. Within this scenario, probability theory and statistical mechanics quickly come into play.

The aim of the workshop is to bring together scientists with different backgrounds – biology, physics and mathematics – who share an interest in stochastic models in ecology and evolutionary biology, to discuss open issues and exchange ideas. A partial list of topics includes: stochastic population dynamics, branching processes, interacting particle systems and statistical-mechanics models in ecology, robustness and adaptability of ecosystems, resilience and criticality of ecological systems, models and prediction of biodiversity, molecular evolution, and neuroscience.

The style of the workshop will be rather informal, with ample opportunity to share ideas and discuss freely. Talks will be organised in thematic sessions, with both colloquia and more technical presentations.
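As a concrete illustration of the kind of stochastic model on the workshop's topic list, here is a minimal sketch of a predator-prey (Lotka-Volterra) system simulated with the Gillespie algorithm. All rates, initial populations, and parameter values are invented for illustration and are not taken from any workshop material.

```python
import random

# Stochastic Lotka-Volterra predator-prey dynamics via the Gillespie
# algorithm. Parameters are illustrative assumptions only.
def gillespie_lv(prey=100, pred=20, t_max=5.0,
                 birth=1.0, predation=0.01, death=0.5, seed=42):
    rng = random.Random(seed)
    t, traj = 0.0, [(0.0, prey, pred)]
    while t < t_max:
        # Propensities of the three elementary reactions
        a1 = birth * prey              # prey reproduces
        a2 = predation * prey * pred   # predator consumes prey, reproduces
        a3 = death * pred              # predator dies
        a0 = a1 + a2 + a3
        if a0 == 0:                    # both populations extinct
            break
        t += rng.expovariate(a0)       # exponential waiting time
        r = rng.uniform(0, a0)
        if r < a1:
            prey += 1
        elif r < a1 + a2:
            prey -= 1
            pred += 1
        else:
            pred -= 1
        traj.append((t, prey, pred))
    return traj

traj = gillespie_lv()
print(len(traj), traj[-1])
```

Each reaction fires after an exponentially distributed waiting time with rate equal to the total propensity, which keeps the demographic noise of small populations explicit rather than averaging it away as a deterministic rate equation would.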

Source: www.pd.infn.it

A framework for designing compassionate and ethical artificial intelligence and artificial consciousness

Intelligence and consciousness have long fascinated humanity, and we have long sought to replicate them in machines. In this work we present design principles for a compassionate and conscious artificial intelligence, together with a computational framework for engineering intelligence, empathy and consciousness in machines. We hope that this framework will allow us to better understand consciousness and to design machines that are conscious and empathetic. Our hope is that this will also shift the discussion from fear of artificial intelligence towards designing machines that embed our cherished values. Consciousness, intelligence and empathy are worthy design goals that can be engineered in machines.


Banerjee S. (2018) A framework for designing compassionate and ethical artificial intelligence and artificial consciousness. PeerJ Preprints 6:e3502v2 https://doi.org/10.7287/peerj.preprints.3502v2

Source: peerj.com

Improving public transportation systems with self-organization: A headway-based model and regulation of passenger alighting and boarding

The equal headway instability – the fact that a configuration with regular time intervals between vehicles tends to be volatile – is a common regulation problem in public transportation systems. Unsatisfactory regulation results in low efficiency and possible collapses of the service. Computational simulations have shown that self-organizing methods can regulate the headway adaptively beyond the theoretical optimum. In this work, we develop a computer simulation of metro systems, fed with real data from the Mexico City Metro, to compare the current regulatory method with a novel self-organizing approach. The current method relies on system-wide data such as minimum and maximum waiting times at stations, while the self-organizing method regulates the headway in a decentralized manner using local information such as passenger inflow and the positions of neighboring trains. The simulation shows that the self-organizing method outperforms the current one because it adapts to environmental changes at the timescale at which they occur. The correlation between the simulation of the current method and empirical observations carried out in the Mexico City Metro provides a baseline for estimating the expected performance of the self-organizing method were it implemented in the real system. We also performed a pilot study at the Balderas station, regulating the alighting and boarding of passengers through guide signs on platforms; analysis of the empirical data shows a reduction in the waiting time of trains at stations. Finally, we provide recommendations to improve public transportation systems.
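To make the decentralized idea concrete, here is a toy sketch – not the authors' actual model – in which trains on a circular line adjust their speed using only the local gap to the train ahead. The track length, speeds, and control rule are invented for illustration, yet roughly equal headways still emerge from the purely local rule.

```python
# Toy decentralized headway regulation on a circular line: each train
# slows down when too close to the train ahead, using only local
# information. All parameters here are invented for illustration.
def step(positions, track_len=100.0, v_max=1.0):
    n = len(positions)
    gap_target = track_len / n             # ideal equal headway
    new = []
    for i, x in enumerate(positions):
        ahead = positions[(i + 1) % n]
        gap = (ahead - x) % track_len      # distance to train ahead
        # Crawl when bunched, full speed when the gap is ample
        v = v_max * min(1.0, gap / gap_target)
        new.append((x + v) % track_len)
    return new

# Start heavily bunched and iterate; headways equalize over time
pos = [0.0, 1.0, 2.0, 3.0]
for _ in range(2000):
    pos = step(pos)
gaps = [(pos[(i + 1) % 4] - pos[i]) % 100.0 for i in range(4)]
print([round(g, 1) for g in gaps])
```

The equal-headway configuration is the fixed point of this local rule: a train with a short gap ahead slows, letting the gap grow, without any central controller comparing system-wide waiting times.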


Carreón G, Gershenson C, Pineda LA (2017) Improving public transportation systems with self-organization: A headway-based model and regulation of passenger alighting and boarding. PLoS ONE 12(12): e0190100. https://doi.org/10.1371/journal.pone.0190100

Source: journals.plos.org

A review of “Pathway Complexity”, a measure claimed to distinguish life from non-life like no measure before

A paper recently published in the Philosophical Transactions of the Royal Society A under the title “A probabilistic framework for identifying biosignatures using Pathway Complexity” claims to offer a revolutionary measure potentially capable of distinguishing life from non-life, and even of discerning life on other planets by finding biosignatures. The method proposed by its authors consists, roughly, of finding the generative grammar behind an object and then counting the number of steps needed to generate that object from its compressed form. Unfortunately, this does not amount to a new measure of complexity. The first part of the algorithm is mostly a description of Huffman’s coding algorithm (see ref. below) and represents the way in which most popular lossless compression algorithms are implemented: finding the building blocks that best compress a string by minimising redundancies, i.e. traversing the string, finding repetitions, and decomposing it into the statistically smallest collection of components that reproduce it without loss of information.
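For reference, the Huffman coding the review alludes to fits in a few lines. This is a standard textbook implementation, not code from the reviewed paper: symbols are counted, the two lightest subtrees are repeatedly merged, and codeword lengths fall out of the resulting tree depths.

```python
import heapq
from collections import Counter

# Classic Huffman coding: build an optimal prefix-free code from
# symbol frequencies by greedily merging the two lightest subtrees.
def huffman_codes(text):
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    # Heap entries are (weight, tiebreak, tree); trees are either a
    # symbol or a (left, right) pair. The tiebreak keeps comparisons
    # away from the trees themselves.
    heap = [(w, i, s) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

codes = huffman_codes("abracadabra")
encoded = "".join(codes[c] for c in "abracadabra")
print(codes, len(encoded))  # frequent symbols get shorter codewords
```

The greedy merge yields codeword lengths that minimise the expected encoded length, which is exactly the "statistically smallest collection of components" behaviour the review describes.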

Source: goo.gl