
tsuomela : modeling   83

Storm -- A Modern Probabilistic Model Checker -- Home
"Storm is a tool for the analysis of systems involving random or probabilistic phenomena. Given an input model and a quantitative specification, it can determine whether the input model conforms to the specification. It has been designed with performance and modularity in mind. "
statistics  modeling 
november 2018 by tsuomela
Home | Pardee Center for International Futures
"A MODEL FOR GLOBAL FORECASTING At the Frederick S. Pardee Center for International Futures, our mission is to explore, understand and shape alternative futures of global change and human development. As part of this pursuit, we have built the International Futures (IFs) model, the most sophisticated and comprehensive forecasting modeling system available to the public. IFs uses our best understanding of global systems to produce forecasts for 186 countries to the year 2100. Our team, based at the Josef Korbel School of International Studies at the University of Denver, works with organizations around the world who share our interest in long-term, integrated analysis of development, security, and sustainability issues."
futures  modeling  global  forecasting 
july 2018 by tsuomela
Zelig
"Zelig is an easy-to-use, free, open source, general purpose statistics program for estimating, interpreting, and presenting results from any statistical method. Zelig turns the power of R, with thousands of open source packages — but with free ranging syntax, diverse examples, and documentation written for different audiences — into the same three commands and consistent documentation for every method. Zelig uses R code from many researchers, making it "everyone’s statistical software." We hope it becomes everyone’s statistical software for applications too, as we designed it so anyone can use it or add their methods to it. We aim for Zelig to be the best way to do analysis, prepare replication files, learn new methods, or teach."
r  statistics  modeling 
april 2017 by tsuomela
Peter Turchin Home - Peter Turchin
"Peter Turchin is a scientist and an author who wants to understand how human societies evolve, and why we see such a staggering degree of inequality in economic performance and effectiveness of governance among nations (see Research Interests). Peter’s approach to answering these questions blends theory building with the analysis of data. He is the founder of a new transdisciplinary field of Cliodynamics, which uses the tools of complexity science and cultural evolution to study the dynamics of historical empires and modern nation-states."
people  research  complexity  ecology  population  modeling 
january 2017 by tsuomela
Modelica and the Modelica Association — Modelica Association
"Modelica® is a non-proprietary, object-oriented, equation based language to conveniently model complex physical systems containing, e.g., mechanical, electrical, electronic, hydraulic, thermal, control, electric power or process-oriented subcomponents."
modeling  programming  open-source  language 
november 2016 by tsuomela
Human and nature dynamics (HANDY): Modeling inequality and use of resources in the collapse or sustainability of societies | SESYNC
"There are widespread concerns that current trends in resource-use are unsustainable, but possibilities of overshoot/collapse remain controversial. Collapses have occurred frequently in history, often followed by centuries of economic, intellectual, and population decline. Many different natural and social phenomena have been invoked to explain specific collapses, but a general explanation remains elusive. In this paper, we build a human population dynamics model by adding accumulated wealth and economic inequality to a predator–prey model of humans and nature. The model structure, and simulated scenarios that offer significant implications, are explained. Four equations describe the evolution of Elites, Commoners, Nature, and Wealth. The model shows Economic Stratification or Ecological Strain can independently lead to collapse, in agreement with the historical record. The measure “Carrying Capacity” is developed and its estimation is shown to be a practical means for early detection of a collapse. Mechanisms leading to two types of collapses are discussed. The new dynamics of this model can also reproduce the irreversible collapses found in history. Collapse can be avoided, and population can reach a steady state at maximum carrying capacity if the rate of depletion of nature is reduced to a sustainable level and if resources are distributed equitably."
paper  research  environment  history  collapse  modeling 
september 2016 by tsuomela
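A rough, heavily simplified sketch of the structure described in the HANDY entry above: four coupled equations for Commoners, Elites, Nature, and Wealth stepped forward with plain Euler integration. All functional forms and parameter values here are illustrative assumptions, not the paper's calibration.

```python
# Minimal HANDY-style sketch: Commoners (xc), Elites (xe), Nature (y), Wealth (w).
# Illustrative parameters only; the published model is richer and carefully calibrated.
beta = 0.03                        # birth rate (both classes)
gamma, lam = 0.01, 100.0           # nature regeneration rate and carrying capacity
delta = 6.7e-6                     # depletion (production) per commoner
alpha_min, alpha_max = 0.01, 0.07  # healthy vs. famine death rates
s = 5.0e-4                         # subsistence need per capita

def death_rate(consumption, need):
    """Death rate rises toward alpha_max as consumption falls below need."""
    shortfall = max(0.0, 1.0 - consumption / need) if need > 0 else 0.0
    return alpha_min + shortfall * (alpha_max - alpha_min)

xc, xe, y, w = 100.0, 1.0, lam, 0.0
dt = 1.0
for t in range(1000):
    need_c, need_e = s * xc, s * xe
    scale = min(1.0, w / (need_c + need_e + 1e-12))   # wealth rations consumption
    cc, ce = scale * need_c, scale * need_e
    dxc = beta * xc - death_rate(cc, need_c) * xc
    dxe = beta * xe - death_rate(ce, need_e) * xe
    dy = gamma * y * (lam - y) - delta * xc * y        # regrowth minus depletion
    dw = delta * xc * y - cc - ce                      # production minus consumption
    xc, xe = xc + dt * dxc, xe + dt * dxe
    y, w = max(0.0, y + dt * dy), max(0.0, w + dt * dw)

print(f"final commoners={xc:.1f}, elites={xe:.1f}, nature={y:.1f}, wealth={w:.1f}")
```

Run long enough, even this toy version shows the overshoot-and-collapse behavior the abstract describes: population grows, nature is depleted faster than it regrows, wealth drains, and famine-level death rates drive the decline.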
Modsti | Modeling Science, Technology & Innovation Conference
"This NSF-funded, agenda-setting conference will review opportunities and challenges associated with the usage of mathematical, statistical, and computational models in science, technology, and innovation (STI) in decision making. Among others, STI models can be employed to simulate the diffusion of ideas and experts, to estimate the impact of population explosion and aging, to explore alternative funding schemas, or to communicate the probable outcomes of different policy decisions."
conference  science  policy  modeling 
june 2016 by tsuomela
[1403.3568] Modeling Social Dynamics in a Collaborative Environment
"Wikipedia is a prime example of today's value production in a collaborative environment. Using this example, we model the emergence, persistence and resolution of severe conflicts during collaboration by coupling opinion formation with article editing in a bounded confidence dynamics. The complex social behavior involved in editing articles is implemented as a minimal model with two basic elements; (i) individuals interact directly to share information and convince each other, and (ii) they edit a common medium to establish their own opinions. Opinions of the editors and that represented by the article are characterised by a scalar variable. When the pool of editors is fixed, three regimes can be distinguished: (a) a stable mainstream article opinion is continuously contested by editors with extremist views and there is slow convergence towards consensus, (b) the article oscillates between editors with extremist views, reaching consensus relatively fast at one of the extremes, and (c) the extremist editors are converted very fast to the mainstream opinion and the article has an erratic evolution. When editors are renewed with a certain rate, a dynamical transition occurs between different kinds of edit wars, which qualitatively reflect the dynamics of conflicts as observed in real Wikipedia data."
preprint  research  modeling  agent-based-model  wikipedia  collaboration  mass-collaboration 
november 2014 by tsuomela
Text Analysis with Topic Models for the Humanities and Social Sciences — Text Analysis with Topic Models for the Humanities and Social Sciences
"Text Analysis with Topic Models for the Humanities and Social Sciences (TAToM) consists of a series of tutorials covering basic procedures in quantitative text analysis. The tutorials cover the preparation of a text corpus for analysis and the exploration of a collection of texts using topic models and machine learning."
text-analysis  topics  digital-humanities  modeling  semantics 
august 2014 by tsuomela
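For a sense of the kind of workflow the tutorials cover, here is a minimal, generic topic-model sketch with scikit-learn (not the TAToM code itself; the corpus and settings are invented):

```python
# Minimal topic-model sketch: document-term matrix + LDA; illustrative corpus.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the senate passed the budget bill after a long debate",
    "the team won the championship game in overtime",
    "the new budget cuts funding for public schools",
    "the striker scored twice in the final game of the season",
]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)              # document-term matrix

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

# Show the top words associated with each inferred topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {', '.join(top)}")
```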
[1212.0018] Evidence for Non-Finite-State Computation in a Human Social System
"Finite-State Machines are a basic model of computation, forming one of the simplest classes in the computational hierarchy. When given a probabilistic transition structure, they are one of the most common methods for description and prediction of symbolic time-series in the biological and social sciences. Here we show how a generalization of a central result for finite-state machines, the pumping lemma, to the probabilistic case, leads to a crucial constraint: sufficiently long sequences will be exponentially suppressed for finite-state processes. We apply the probabilistic pumping lemma to an analysis of behavioral patterns in the distributed, open-source Wikipedia community to demonstrate strong evidence for the emergence of functional powers over and above the regular grammars, and provide evidence to associate these with fundamentally interpersonal and social phenomena."
wikipedia  collaboration  modeling  computer-science  formal  complexity  self-organized 
april 2013 by tsuomela
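A toy illustration of the exponential suppression the abstract invokes (not the paper's analysis): under any fixed probabilistic finite-state process, the probability of repeating a given block n times shrinks geometrically in n, so log-probability falls linearly. The machine and its transition probabilities below are arbitrary assumptions.

```python
# Toy two-state probabilistic finite-state machine over the alphabet {a, b}.
import math

# trans[state][symbol] = list of (next_state, probability); chosen arbitrarily.
trans = {
    0: {"a": [(0, 0.6)], "b": [(1, 0.4)]},
    1: {"a": [(0, 0.3)], "b": [(1, 0.7)]},
}

def sequence_probability(seq, start=0):
    """Forward algorithm: total probability of emitting seq from the start state."""
    alpha = {start: 1.0}
    for sym in seq:
        nxt = {}
        for state, w in alpha.items():
            for nstate, p in trans[state].get(sym, []):
                nxt[nstate] = nxt.get(nstate, 0.0) + w * p
        alpha = nxt
    return sum(alpha.values())

# Pumping the block "ab" n times: log-probability drops linearly in n,
# i.e. the probability is exponentially suppressed. Heavy-tailed repetition
# statistics in real data therefore argue against a finite-state explanation.
for n in (1, 5, 10, 20, 40):
    print(n, math.log(sequence_probability("ab" * n)))
```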
Bonacich, P. and Lu, P.: Introduction to Mathematical Sociology.
"Mathematical models and computer simulations of complex social systems have become everyday tools in sociology. Yet until now, students had no up-to-date textbook from which to learn these techniques. Introduction to Mathematical Sociology fills this gap, providing undergraduates with a comprehensive, self-contained primer on the mathematical tools and applications that sociologists use to understand social behavior."
sociology  mathematics  modeling 
march 2013 by tsuomela
Mapping the Origins and Expansion of the Indo-European Language Family
"There are two competing hypotheses for the origin of the Indo-European language family. The conventional view places the homeland in the Pontic steppes about 6000 years ago. An alternative hypothesis claims that the languages spread from Anatolia with the expansion of farming 8000 to 9500 years ago. We used Bayesian phylogeographic approaches, together with basic vocabulary data from 103 ancient and contemporary Indo-European languages, to explicitly model the expansion of the family and test these hypotheses. We found decisive support for an Anatolian origin over a steppe origin. Both the inferred timing and root location of the Indo-European language trees fit with an agricultural expansion from Anatolia beginning 8000 to 9500 years ago. These results highlight the critical role that phylogeographic inference can play in resolving debates about human prehistory."
languages  linguistics  modeling  bayes  statistics  geography  anthropology 
august 2012 by tsuomela
Nutonian Inc.
We develop advanced scientific data mining technologies that uncover deep mathematical relationships hidden in your data. Whether your goal is deeper insight, better prediction, or faster optimization, formulize your data to discover its underlying hidden patterns.
science  mathematics  modeling  statistics  regression  software  cloud  windows 
june 2012 by tsuomela
NARCCAP: The North American Regional Climate Change Assessment Program
NARCCAP is an international program that serves the high resolution climate scenario needs of the United States, Canada, and northern Mexico, using regional climate model, coupled global climate model, and time-slice experiments.
climate  environment  climate-change  impact  modeling  science  computational-science 
june 2012 by tsuomela
Brett Keller » Hunger Games survival analysis
"As a student of epidemiology and economics I feel duty-bound to apply my cursory knowledge of statistics to the novel natural cohort presented in the Hunger Games novel, as documented by author Suzanne Collins. I present a Hunger Games survival analysis: in a Cox proportional hazards model, which covariates are associated with the odds (or hazard ratios) being ever in your favor? A taste of what’s to come:"
statistics  fun  literature  sf  modeling  survival 
april 2012 by tsuomela
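The analysis is a Cox proportional hazards fit; a minimal sketch of that kind of fit with the lifelines Python package, using a tiny invented cohort and covariates purely to show the mechanics (not Keller's actual data):

```python
# Cox proportional hazards sketch with lifelines; cohort and covariates invented.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_survived": [1, 2, 3, 5, 7, 9, 12, 15, 18, 18],   # time in the arena
    "died":          [1, 1, 1, 1, 1, 1, 1, 1, 1, 0],        # 0 = censored (victor)
    "career":        [0, 1, 0, 0, 1, 0, 1, 1, 0, 1],        # career-tribute indicator
    "age":           [12, 16, 13, 15, 18, 14, 17, 18, 15, 17],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="days_survived", event_col="died")
cph.print_summary()   # hazard ratios: are the odds ever in your favor?
```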
PLoS Computational Biology: Early Warning Signals for Critical Transitions: A Generalized Modeling Approach
Critical transitions are sudden, often irreversible, changes that can occur in a large variety of complex systems
complexity  transition  crisis  warnings  risk  modeling  signals 
february 2012 by tsuomela
Economics Debunked: Chapter Two for Sixth Graders « naked capitalism
"That means one way to figure out whether mainstream economics makes sense is to see what the assumptions are, and to try to decide whether those assumptions make sense. For example, what were Samuelson’s assumptions? What were Arrow and Debreu’s assumptions?

Samuelson had one big assumption, that economists call ergodicity.

[Teacher pauses to give kids time to stumble over the word.]

When they say ergodicity, they mean that no matter what happens in the world, in the end, everything will reach a point where things stop changing. That point is called the “equilibrium.” At the equilibrium, everyone will end up with a certain amount of money. The amount of money that everybody gets at the equilibrium depends on how talented they are, and not on anything that happened before. So if you rob a bank, it won’t matter because when you get to the equilibrium, if you’re stupid, you will still have the same amount of money you would have had if you didn’t rob the bank."
economics  history  story  mathematics  modeling  scientism  sts 
september 2011 by tsuomela
The Tyranny of Scales - PhilSci-Archive
"This paper examines a fundamental problem in applied mathematics. How can one model the behavior of materials that display radically different, dominant behaviors at different length scales. Although we have good models for material behaviors at small and large scales, it is often hard to relate these scale-based models to one another."
philosophy  mathematics  modeling  scale  via:cshalizi 
july 2011 by tsuomela
Daniel Nettle's personal page
Author of Personality: What Makes You the Way You Are. "I am a behavioural scientist interested in applying ideas from ecology and evolution to human behaviour. I have worked on such topics as cooperation, reproductive decisions, parenting and families, personality, and health. My research uses theoretical modelling, as well as behavioural data from several countries, especially the UK."
people  evolution  biology  behavior  human  modeling  psychology 
april 2011 by tsuomela
SpringerLink - Computational
"This paper develops the concepts and methods of a process we will call ldquoalignment of computational modelsrdquo or ldquodockingrdquo for short. Alignment is needed to determine whether two models can produce the same results, which in turn is the basis for critical experiments and for tests of whether one model can subsume another. We illustrate our concepts and methods using as a target a model of cultural transmission built by Axelrod. For comparison we use the Sugarscape model developed by Epstein and Axtell."
simulation  computer  agent-based-model  modeling  computational-science  organization 
march 2011 by tsuomela
Anthropogenic greenhouse gas contribution to flood risk in England and Wales in autumn 2000 : Nature : Nature Publishing Group
"Here we present a multi-step, physically based ‘probabilistic event attribution’ framework showing that it is very likely that global anthropogenic greenhouse gas emissions substantially increased the risk of flood occurrence in England and Wales in autumn 2000."
global-warming  climate  precipitation  meteorology  environment  modeling  observation 
february 2011 by tsuomela
Human contribution to more-intense precipitation extremes : Nature : Nature Publishing Group
"Here we show that human-induced increases in greenhouse gases have contributed to the observed intensification of heavy precipitation events found over approximately two-thirds of data-covered parts of Northern Hemisphere land areas. These results are based on a comparison of observed and multi-model simulated changes in extreme precipitation over the latter half of the twentieth century analysed with an optimal fingerprinting technique. Changes in extreme precipitation projected by models, and thus the impacts of future changes in extreme precipitation, may be underestimated because models seem to underestimate the observed increase in heavy precipitation with warming."
global-warming  climate  precipitation  meteorology  environment  modeling  observation 
february 2011 by tsuomela
How wise are crowds?
Fortunately, in a paper to be published in the Review of Economic Studies, researchers from MIT’s Departments of Economics and Electrical Engineering and Computer Science have demonstrated that, as networks of people grow larger, they’ll usually tend to converge on an accurate understanding of information distributed among them, even if individual members of the network can observe only their nearby neighbors. A few opinionated people with large audiences can slow that convergence, but in the long run, they’re unlikely to stop it.
collective-intelligence  crowdsourcing  modeling  game-theory  simulation  intelligence  wisdom  networks  collective 
november 2010 by tsuomela
Interview with Laurence Meyer :: 10.06.2010 :: Federal Reserve Bank of Cleveland
So I think we have two kinds of modeling traditions. First there is the classic tradition...This is the beginning of modern macro-econometric model building. That’s the kind of models that I would use, the kind of models that folks at the Board use.
There’s also another tradition that began to build up in the late seventies to early eighties—the real business cycle or neoclassical models. It’s what’s taught in graduate schools. It’s the only kind of paper that can be published in journals. It is called “modern macroeconomics.”
Those models are a diversion. They haven’t been helpful at all at understanding anything that would be relevant to a monetary policymaker or fiscal policymaker. So we’d better come back to, and begin with as our base, these classic macro-econometric models. We don’t need a revolution.
econometrics  economics  history  model  modeling 
october 2010 by tsuomela
CASOS: Home | CASOS
CASOS brings together computer science, dynamic network analysis and the empirical study of complex socio-technical systems. Computational and social network techniques are combined to develop a better understanding of the fundamental principles of organizing, coordinating, managing and destabilizing systems of intelligent adaptive agents (human and artificial) engaged in real tasks at the team, organizational or social level. Whether the research involves the development of metrics, theories, computer simulations, toolkits, or new data analysis techniques advances in computer science are combined with a deep understanding of the underlying cognitive, social, political, business and policy issues.
complexity  modeling  research  networks  social  analysis  network-analysis  simulation  sociology  agent-based-model  school(CarnegieMellon) 
september 2010 by tsuomela
Altruism can be explained by natural selection : Nature News
A two-part mathematical analysis, published in Nature this week, overturns this tenet by showing that it is possible for eusocial behaviour to evolve through standard natural-selection processes.
altruism  evolution  cooperation  biology  modeling 
august 2010 by tsuomela
Trainset Ghetto - The Morning News
Boarded-up windows in abandoned brick buildings, grass growing from sidewalk cracks, rusty storefronts—the cycle of a city’s evolution and abandonment is familiar. Artist Peter Feigenbaum reimagines these ghettos in miniature, using components from toy train sets and more. “‘Trainset Ghetto’ is voyeurism more than it is hobbyism. It is the physical byproduct of teenage suburban daydreams and attempts to live vicariously through an alien post-urban 1980s landscape that was in no way part of my quotidian existence.”

Peter Feigenbaum is a Brooklyn-based installation artist and musician. He has shown work related to his “Trainset Ghetto” installation at a number of small New York galleries since 2007. He studied architecture at Yale University.
art  photography  modeling  gallery 
august 2010 by tsuomela
Econometric Measures of Systemic Risk in the Finance and Insurance Sectors
We propose several econometric measures of systemic risk to capture the interconnectedness among the monthly returns of hedge funds, banks, brokers, and insurance companies based on principal components analysis and Granger-causality tests. We find that all four sectors have become highly interrelated over the past decade, increasing the level of systemic risk in the finance and insurance industries. These measures can also identify and quantify financial crisis periods, and seem to contain predictive power for the current financial crisis. Our results suggest that hedge funds can provide early indications of market dislocation, and systemic risk arises from a complex and dynamic network of relationships among hedge funds, banks, insurance companies, and brokers.
economics  complexity  systems  risk  measurement  modeling  interconnection  social-networks  network-analysis  finance 
august 2010 by tsuomela
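A minimal sketch of the two ingredients named in the abstract, principal components analysis and pairwise Granger-causality tests, applied to simulated "sector" return series (the paper's actual estimators and data are more involved; the dependence below is built in by construction):

```python
# PCA + Granger-causality sketch on simulated monthly return series.
import numpy as np
from sklearn.decomposition import PCA
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
n = 240                                                     # ~20 years of monthly returns
banks = rng.normal(0, 0.02, n)
hedge_funds = 0.5 * np.roll(banks, 1) + rng.normal(0, 0.02, n)   # lagged dependence on banks
insurers = rng.normal(0, 0.02, n)
brokers = 0.3 * banks + rng.normal(0, 0.02, n)
returns = np.column_stack([banks, hedge_funds, insurers, brokers])

# Variance share of the first principal component: a rough proxy for
# how interconnected the four sectors are.
pca = PCA().fit(returns)
print("PC1 variance share:", pca.explained_variance_ratio_[0])

# Does column 0 (banks) Granger-cause column 1 (hedge funds)?
# grangercausalitytests expects a 2-column array and tests whether the
# second column Granger-causes the first.
grangercausalitytests(returns[:, [1, 0]], maxlag=2)
```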
EconModel Home Page
Twenty-one of the most important models in economics (micro and macro!) are included in the Classic Economic Models collection of EconModel applications.
economics  model  modeling  education  reference  study-guide 
july 2010 by tsuomela
Open Agent Based Modeling Consortium | ... a node in the CoMSES network
The OpenABM Consortium is a group of researchers, educators, and professionals with a common goal - improving the way we develop, share, and utilize agent based models. We are currently developing a model archive to preserve and maintain the digital artifacts and source code comprising an agent based model.
agent-based-model  research  simulation  complexity  modeling  collaboration  model  professional-association  data 
june 2010 by tsuomela
Lab Experiments for the Study of Social-Ecological Systems -- Janssen et al. 328 (5978): 613 -- Science
Governance of social-ecological systems is a major policy problem of the contemporary era. Field studies of fisheries, forests, and pastoral and water resources have identified many variables that influence the outcomes of governance efforts. We introduce an experimental environment that involves spatial and temporal resource dynamics in order to capture these two critical variables identified in field research. Previous behavioral experiments of commons dilemmas have found that people are willing to engage in costly punishment, frequently generating increases in gross benefits, contrary to game-theoretical predictions based on a static pay-off function. Results in our experimental environment find that costly punishment is again used but lacks a gross positive effect on resource harvesting unless combined with communication. These findings illustrate the importance of careful generalization from the laboratory to the world of policy.
commons  cooperation  ecology  institutions  ostrom  elinor  economics  science  modeling  evolution  experimental  via:cshalizi 
june 2010 by tsuomela
[1005.4117] Random Numbers in Scientific Computing: An Introduction
Random numbers play a crucial role in science and industry. Many numerical methods require the use of random numbers, in particular the Monte Carlo method. Therefore it is of paramount importance to have efficient random number generators. The differences, advantages and disadvantages of true and pseudo random number generators are discussed with an emphasis on the intrinsic details of modern and fast pseudo random number generators.
randomness  random  mathematics  science  physics  algorithms  lecture  modeling  reference  simulation 
june 2010 by tsuomela
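The paper distinguishes true from pseudo random number generators; the simplest pseudo-random family it covers is the linear congruential generator, sketched below (constants are the classic Numerical Recipes values; any real Monte Carlo work should use a modern generator such as NumPy's default PCG64):

```python
# Minimal linear congruential generator (LCG): x_{n+1} = (a*x_n + c) mod m.
# Good enough to illustrate reproducible pseudo-randomness; not for production use.
class LCG:
    def __init__(self, seed=42, a=1664525, c=1013904223, m=2**32):
        self.state, self.a, self.c, self.m = seed, a, c, m

    def next_uniform(self):
        """Return a pseudo-random float in [0, 1)."""
        self.state = (self.a * self.state + self.c) % self.m
        return self.state / self.m

rng = LCG(seed=2024)
print([rng.next_uniform() for _ in range(4)])   # same seed, same sequence
```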
PLoS Computational Biology: Evolutionary Establishment of Moral and Double Moral Standards through Spatial Interactions
Situations where individuals have to contribute to joint efforts or share scarce resources are ubiquitous. Yet, without proper mechanisms to ensure cooperation, the evolutionary pressure to maximize individual success tends to create a tragedy of the commons (such as over-fishing or the destruction of our environment). This contribution addresses a number of related puzzles of human behavior with an evolutionary game theoretical approach as it has been successfully used to explain the behavior of other biological species many times, from bacteria to vertebrates. Our agent-based model distinguishes individuals applying four different behavioral strategies: non-cooperative individuals (“defectors”), cooperative individuals abstaining from punishment efforts (called “cooperators” or “second-order free-riders”), cooperators who punish non-cooperative behavior (“moralists”), and defectors, who punish other defectors despite being non-cooperative themselves (“immoralists”).
cooperation  modeling  agent-based-model  evolution  game-theory  computational-science  simulation  biology  open-access 
may 2010 by tsuomela
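A compact, heavily simplified sketch of a spatial public-goods game with the four strategy types the abstract names (defectors, second-order free-riding cooperators, moralists, immoralists). The payoff structure, update rule, and parameters below are assumptions for illustration, not the published model.

```python
# Spatial public-goods game with punishment on a torus grid; illustrative only.
import random

random.seed(1)
N = 20                                    # grid is N x N, wrapped (torus)
STRATS = ["D", "C", "M", "I"]             # defector, cooperator, moralist, immoralist
COOPERATES = {"C", "M"}                   # who contributes to the common pool
PUNISHES = {"M", "I"}                     # who pays to fine non-cooperators
NONCOOP = {"D", "I"}                      # who can be fined
r, COST, FINE, PUNISH_COST = 3.0, 1.0, 1.0, 0.3

grid = [[random.choice(STRATS) for _ in range(N)] for _ in range(N)]

def neighbors(i, j):
    return [((i - 1) % N, j), ((i + 1) % N, j), (i, (j - 1) % N), (i, (j + 1) % N)]

def payoff(i, j):
    """Payoff of agent (i, j) from one public-goods round with its four neighbors."""
    group = [(i, j)] + neighbors(i, j)
    contribs = sum(COST for (a, b) in group if grid[a][b] in COOPERATES)
    p = r * contribs / len(group) - (COST if grid[i][j] in COOPERATES else 0.0)
    for (a, b) in neighbors(i, j):
        if grid[i][j] in PUNISHES and grid[a][b] in NONCOOP:
            p -= PUNISH_COST              # cost of punishing a non-cooperator
        if grid[a][b] in PUNISHES and grid[i][j] in NONCOOP:
            p -= FINE                     # fine received as a non-cooperator
    return p

for step in range(2000):                  # imitation dynamics, one agent at a time
    i, j = random.randrange(N), random.randrange(N)
    best = max(neighbors(i, j) + [(i, j)], key=lambda cell: payoff(*cell))
    grid[i][j] = grid[best[0]][best[1]]   # copy the best-scoring local strategy

print({s: sum(row.count(s) for row in grid) for s in STRATS})
```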
National Center for Computational Sciences
The National Center for Computational Sciences (NCCS) provides the most powerful computing resources in the world for open scientific research. It is one of the world’s premier science facilities—an unparalleled research environment that supports dramatic advances in understanding how the physical world works and using that knowledge to address our most pressing national and international concerns.

The NCCS was founded in 1992 to advance the state of the art in high-performance computing by putting new generations of powerful parallel supercomputers into the hands of the scientists who can use them the most productively. It is a managed activity of the Advanced Scientific Computing Research program of the Department of Energy Office of Science (DOE-SC) and is located at the Oak Ridge National Laboratory.
computational-science  computer  infrastructure  computer-science  modeling  federal  lab  academic-lab  government 
april 2010 by tsuomela
DOE - National Energy Technology Laboratory: Home Page
The National Energy Technology Laboratory (NETL), part of DOE’s national laboratory system, is owned and operated by the U.S. Department of Energy (DOE). NETL supports DOE’s mission to advance the national, economic, and energy security of the United States.
energy  government  federal  lab  science  computational-science  modeling 
april 2010 by tsuomela
Open Science Grid Home page
OSG brings together computing and storage resources from campuses and research communities into a common, shared grid infrastructure over research networks via a common set of middleware.
research  science  modeling  computing  technology  computer  academia  community  grid  cluster  open-science  collaboration  project(Utenn) 
april 2010 by tsuomela
Homepage for Prof. C.H. Sequin
Carlo H. Sequin - links to many sculptures fabricated from mathematical shapes and forms.
people  mathematics  graphics  modeling  sculpture  art 
march 2010 by tsuomela
Technology Review: Blogs: arXiv blog: Best Connected Individuals Are Not The Most Influential Spreaders in Social Networks
By contrast, "a less connected person who is strategically placed in the core of the network will have a significant effect that leads to dissemination through a large fraction of the population."
networks  network-analysis  social-networking  social  modeling 
february 2010 by tsuomela
[0911.2390] How Creative Should Creators Be To Optimize the Evolution of Ideas? A Computational Model
There are both benefits and drawbacks to creativity. In a social group it is not necessary for all members to be creative to benefit from creativity; some merely imitate or enjoy the fruits of others' creative efforts. What proportion should be creative? This paper contains a very preliminary investigation of this question carried out using a computer model of cultural evolution referred to as EVOC (for EVOlution of Culture)....For all levels or creativity, the diversity of ideas in a population is positively correlated with the ratio of creative agents.
creativity  innovation  modeling  evolution  social-networks  analysis 
december 2009 by tsuomela
Open-source software for Operations Research and Industrial Engineering
This page contains links to some of the most useful free software and open-source software for operations research and industrial engineering.
open-source  software  operationsResearch  mathematics  linear-algebra  modeling 
november 2009 by tsuomela
Open Left:: The Crash: Who Saw It Coming--And Why
Dutch economist Dirk Bezemer, writing in the Financial Times on September 7 ("Why some economists could see the crisis coming"), has a much more detailed explanation in a 51-page paper, "'No One Saw This Coming': Understanding Financial Crisis Through Accounting Models" (pdf). Long story short, Bezemer set out to find those who had been right in predicting the financial meltdown, not just randomly but because of a well-reasoned argument. He found eight examples, including Baker, and analyzed what they had in common. He discovered that they all relied on accounting models that look at the economy in terms of stocks and flows, in sharp contrast to the standard macro-economic models, which have no way of predicting a financial crisis, since their programming does not allow for the possibility.
economics  prediction  recession  macroeconomic  methods  modeling  retrospective 
october 2009 by tsuomela
Lucas roundtable: Ask the right questions | Free exchange | Economist.com
But all the tools in the world are useless if we lack the imagination needed to build the right models. Models are built to answer specific questions.
economics  macroeconomic  recession  failure  model  questions  modeling 
august 2009 by tsuomela
The Failure of Macroeconomics « ThinkMarkets
it is not simply a matter of finding the right explanation of the recent financial meltdown and recession. The search by most macroeconomists is constrained by a certain set of unquestioned methodological precepts. These precepts go to the heart of the conception of Economics as a Science.
economics  macroeconomic  methodology  ideology  scientism  certainty  mathematics  modeling 
july 2009 by tsuomela
Beating The Radar: Getting A Jump On Storm Prediction
By running high-speed five-minute satellite scans through a carefully designed computer algorithm, the scientists can quickly analyze cloud top temperature changes to look for signs of storm formation.
weather  meteorology  satellite  remote-sensing  observation  computer  modeling  forecast 
june 2009 by tsuomela
SSRN-Networks in Finance by Franklin Allen, Ana Babus
Modern financial systems exhibit a high degree of interdependence. There are different possible sources of connections between financial institutions, stemming from both the asset and the liability side of their balance sheet. For instance, banks are directly connected through mutual exposures acquired on the interbank market. Likewise, holding similar portfolios or sharing the same mass of depositors creates indirect linkages between financial institutions. Broadly understood as a collection of nodes and links between nodes, networks can be a useful representation of financial systems.
networks  network-analysis  banking  finance  economics  modeling 
february 2009 by tsuomela
Angry Bear: Background on "fresh water" and "salt water" macroeconomics
fresh water = Chicago-school free-marketers; salt water = coastal schools that question the models.
economics  story-telling  modeling  truth  rational-markets  markets  ideology 
january 2009 by tsuomela
Science News / Cooling Climate ‘consensus’ Of 1970s Never Was
When global warming skeptics draw misleading comparisons between scientists’ nascent understanding of climate processes in the 1970s and their level of knowledge today, “it’s absolute nonsense,” Schneider says. Back then, scientists were just beginning to study climate trends and their causes, and the probability of finding evidence to disprove a particular hypothesis was relatively high. Nowadays, he contends, “the likelihood of new evidence to overthrow the concept of global warming is small. Warming is virtually certain.”
environment  climate  global-warming  1970s  science  history  modeling  evidence  consensus 
december 2008 by tsuomela
How to Save the World
Business Risk, Prediction Markets, Sustainability, Resilience, And The Wisdom Of Crowds
business  risk  modeling  storytelling 
march 2008 by tsuomela
Why it is hard to share the wealth - fundamentals - 12 March 2005 - New Scientist
This, along with research data from other countries, suggests that there are two economic classes. In one, the rich grow richer while in the other the poor stay poor.
economics  econophysics  physics  modeling  wealth  distribution  via:pollard 
october 2007 by tsuomela
Good Math, Bad Math : Using Bad Math to Create Bad Models to Produce Bad Results
Modeling evolution as a search of a fitness landscape: it's pretty common to model evolution that way, but it is worth pointing out that while search is a useful model of evolution, it's far from a perfect one.
math  evolution  simulation  modeling  commentary 
may 2007 by tsuomela
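For what "evolution as a search of a fitness landscape" typically means in such models, here is a minimal hill-climbing sketch over bit-strings (purely illustrative assumptions; the linked post's point is precisely that this picture is a simplification):

```python
# Minimal "evolution as search" sketch: mutate a bit-string genotype and keep
# improvements on a fixed, toy fitness landscape.
import random

random.seed(0)
L = 30
fitness = lambda g: sum(g)                      # toy landscape: number of 1-bits

genotype = [random.randint(0, 1) for _ in range(L)]
for generation in range(500):
    mutant = genotype[:]
    k = random.randrange(L)
    mutant[k] ^= 1                              # flip one random bit
    if fitness(mutant) >= fitness(genotype):    # selection keeps non-worse mutants
        genotype = mutant

print(fitness(genotype), "of", L)
```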