
pierredv : modeling   38

[pdf] The Nile River Records Revisited - How good were Joseph's predictions? Ghil_NileTalk.pdf
Ecole Normale Supérieure, Paris, and University of California, Los Angeles
Michael Ghil
Egypt  modeling  statistics 
april 2019 by pierredv
Engineering Tools | AGI
"Systems Tool Kit (STK) is the premier software for providing four-dimensional modeling, simulation, and analysis of objects from land, sea, air, and space in order to evaluate system performance in real or simulated-time."
satellite  space  modeling  simulation  space-debris  tracking 
august 2017 by pierredv
World’s Most Powerful Emulator of Radio-Signal Traffic Opens for Business, April 2017
"New electronic simulation forum for wireless communication will host much of DARPA’s Spectrum Collaboration Challenge"

Paul Tilghman: “By contrast, SC2 is asking a group of radios that weren’t designed to work together to learn how to optimize spectrum capacity in real-time, and is relying on artificial intelligence to find and take advantage of ‘gaps’ and other opportunities to increase efficiency. You can’t satisfactorily learn how to solve this puzzle unless you address it at scale, and that’s why the Colosseum is such a critical part of the solution.”

"By the numbers, the Colosseum testbed is a 256-by-256-channel RF channel emulator, which means it can calculate and simulate in real-time more than 65,000 channel interactions among 256 wireless devices. Each simulated channel behaves as though it has a bandwidth (information content) of 100 MHz, which means the testbed supports 25.6 GHz of bandwidth in any instant. Moreover, each channel’s transmission and reception frequency is tunable between 10 MHz (as in broadcast FM radio) and 6 GHz (as in WiFi)."
DARPA  SC2  mirror-worlds  simulation  modeling 
august 2017 by pierredv
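The quoted Colosseum figures check out with back-of-the-envelope arithmetic:

```python
# Colosseum testbed figures as quoted in the DARPA SC2 article.
devices = 256
channel_interactions = devices * devices  # full pairwise channel matrix

per_channel_bw_mhz = 100
aggregate_bw_ghz = devices * per_channel_bw_mhz / 1000

print(channel_interactions)  # 65536, i.e. "more than 65,000"
print(aggregate_bw_ghz)      # 25.6 GHz "in any instant"
```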
[pdf] The Financial Modelers' Manifesto - by Emanuel Derman and Paul Wilmott
originally written by Emanuel Derman and Paul Wilmott in the wake of the 2008 financial crisis:

∼ I will remember that I didn’t make the world, and it doesn’t satisfy my equations.

∼ Though I will use models boldly to estimate value, I will not be overly impressed by mathematics.

∼ I will never sacrifice reality for elegance without explaining why I have done so.

∼ Nor will I give the people who use my model false comfort about its accuracy. Instead, I will make explicit its assumptions and oversights.

∼ I understand that my work may have enormous effects on society and the economy, many of them beyond my comprehension.
modeling  * 
october 2016 by pierredv
NASA to Test Upgraded Earth Models for Solar Storm Threat - IEEE Spectrum, Mar 2016
"NASA relies on spacecraft to help keep watch for solar storms erupting on the sun. But on the ground, the space agency’s Solar Shield project aims to improve computer simulations that predict if those solar storms will create electrical hazards for power plants and transmission lines."
“One of the big new things coming out of this project in the next few months or so will be the inclusion of a full physics, 3-D treatment of the geomagnetic induction process that drives the main hazard from solar storms,” says Antti Pulkkinen, a heliophysicist at NASA’s Goddard Space Flight Center.
"The so-called Carrington Event of 1859 mostly took down telegraph systems and gave some telegraph operators a frightful shock. But a 2013 study by the Lloyd’s insurance market estimated that a similar event in modern times could cause up to US $2.6 trillion in damage for North America alone. In 2012, the Earth dodged a bullet when a huge solar flare that would have rivaled the Carrington Event missed striking our planet by just nine days."
IEEE-Spectrum  solar-storm  NOAA  NASA  modeling  Carrington-Event 
march 2016 by pierredv
Climate change means the flood defence rule book needs a rewrite - New Scientist - Opinion 16 Jan 2016
"The chaotic behaviour of weather systems makes it impossible to accurately predict rainfall, river flows and the like more than a few weeks in advance. However, hydrologists tend to assume that these variables fluctuate randomly in the long run, which means that their average value, or the probability of exceeding a given threshold, can be estimated accurately from lots of observations. What’s more, these results do not change over time – a property known as “stationarity”– and so form the basis of flood defence plans."

But some hydrologists argue that, because of climate change, stationarity is dead.
NewScientist  opinion  climate-change  modeling  statistics  weather 
march 2016 by pierredv
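The stationarity assumption the article questions can be made concrete with a toy calculation (illustrative synthetic numbers, not real hydrological data): under stationarity, the exceedance probability of a design flood threshold is simply its empirical frequency over a long record, and a return period follows directly.

```python
import random

random.seed(42)

# Hypothetical annual-maximum river flows in m^3/s (synthetic, for
# illustration only). Under the stationarity assumption, the exceedance
# probability of a design threshold is just the empirical frequency
# over many years of observations -- the estimate climate change
# undermines if the underlying distribution is drifting.
flows = [random.gauss(500, 100) for _ in range(1000)]
threshold = 700  # hypothetical design flood level

p_exceed = sum(f > threshold for f in flows) / len(flows)
return_period = 1 / p_exceed if p_exceed else float("inf")
```

If the distribution shifts over time (non-stationarity), this single long-run frequency no longer characterizes future risk, which is the rule-book problem the article describes.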
Consultants suggest new approach to spectrum demand forecasts - PolicyTracker Jul 2015
- "The term “spectrum crunch” has been successful as a “call for action,” prompting governments worldwide to free up spectrum for wireless uses, say Brian Williamson and Sam Wood, the authors of the report. “However, as a way of thinking about and modelling spectrum demand, the notion of a spectrum crunch may have outlived its usefulness.” Instead, the authors floated the idea of using a “bootstrap” model which balances network capacity cost against how much consumers are willing to pay for data"
PolicyTracker  Plum  spectrum-crunch  modeling  cellular 
july 2015 by pierredv
Radio Mobile - RF propagation simulation software - Welcome
"Radio Mobile is software by Roger Coudé VE2DBE. The program simulates RF propagation and is free for the amateur radio community."
radio  propagation  simulation  modeling 
june 2015 by pierredv
facebook/augmented-traffic-control · GitHub
via Andreas Achtzehn, March 2015: "Facebook has today open-sourced a tool they use to simulate different mobile network connections, e.g. EDGE or congested 3G, to test user experience of their apps. Very interesting, could also be useful for future Wi-Fi congestion research, e.g. by allowing mapping user experience metrics to controlled network inhibitions. The tool runs on a standard Linux router, very nice."
github  networking  modeling  Facebook 
march 2015 by pierredv
Ofcom | BBC/Ofcom DAB In-home Reception Trial
"the project obtained DAB reception measurements within households located in a number of pre-selected coverage areas in order to determine how well the model's predicted level of coverage matched measured levels of coverage. ... The findings of the two phases of this study confirm that (1) on average the Prediction Model gives a good indication of the level of DAB coverage and (2) DAB reception in 'good' DAB coverage areas is equivalent to good FM reception"
Ofcom  measurement  modeling  prediction  propagation  BBC  DAB 
january 2015 by pierredv
Model Universe recreates evolution of the cosmos : Nature News & Comment May 2014
Work by Mark Vogelsberger et al., MIT: "Can current theories of cosmology explain how the Universe evolved? One way to find out is to plug everything we think we know about the early Universe and how galaxies form into a supercomputer, and see what comes out. In a simulation presented today in Nature, researchers did just that — and revealed a cosmos that looks rather like our own. The findings lend weight to the standard model of cosmology, but could also help physicists to probe where our models of galaxy formation fall down."
cosmology  universe  modeling  simulation  mirror-worlds  NatureJournal 
may 2014 by pierredv
EPFD Software - Recommendation ITU-R S 1503
"Visualyse EPFD is a mature and fully tested implementation of the direct simulation algorithm contained in Recommendation ITU-R S.1503. .. Visualyse EPFD evaluates the compatibility of a non-GSO filing with the limits in Article 22 – assessed according to the methods of Recommendation ITU-R S.1503. For each filing there are multiple applicable EPFD limits depending on frequency band and victim antenna dish size. Visualyse EPFD automatically sets up simulations for each applicable limit. It calculates the worst case location for the victim station and undertakes a complex calculation for each case. The calculation is based on the dynamics of the non-GSO system and the geometry dependent EPFD masks that will be notified with the network. For detailed discussion of the algorithm you can download documents from the download section of this site"
spectrum  field-strength  ITU  modeling  simulation  Transfinite 
march 2014 by pierredv
CEPT.ORG - ECO - Tools and Services - SEAMCAT
SEAMCAT is a software tool based on the Monte-Carlo simulation method, which is developed within the frame of European Conference of Postal and Telecommunication administrations (CEPT). This tool permits statistical modelling of different radio interference scenarios for performing sharing and compatibility studies between radiocommunications systems (short range devices, GSM, UMTS, LTE, etc… ) in the same or adjacent frequency bands.
propagation  interference  modeling  CEPT  SEAMCAT  ECO 
november 2013 by pierredv
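The Monte Carlo approach SEAMCAT uses can be sketched in a few lines (a generic toy sharing study, not SEAMCAT's actual engine; all signal-level numbers below are illustrative assumptions): draw desired and interfering received signal strengths over many random trials and count how often the carrier-to-interference ratio falls below a protection criterion.

```python
import random

random.seed(1)

# Toy Monte-Carlo compatibility study in the spirit of SEAMCAT (not its
# actual engine). Desired and interfering received signal strengths in
# dBm are drawn from normal distributions standing in for log-normal
# shadowing; the illustrative means, spreads, and protection ratio are
# assumptions, not values from any real study.
N = 100_000
protection_ratio_db = 19  # required C/I at the victim receiver

def trial():
    drss = random.gauss(-80, 6)    # desired received signal strength
    irss = random.gauss(-110, 10)  # interfering received signal strength
    return (drss - irss) < protection_ratio_db  # interference event?

p_interference = sum(trial() for _ in range(N)) / N
```

The output is a statistical probability of interference for the scenario, which is exactly the kind of figure sharing and compatibility studies between radiocommunication systems report.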
Human cycles: History as science : Nature News & Comment
"To Peter Turchin, who studies population dynamics at the University of Connecticut in Storrs, the appearance of three peaks of political instability at roughly 50-year intervals is not a coincidence. For the past 15 years, Turchin has been taking the mathematical techniques that once allowed him to track predator–prey cycles in forest ecosystems, and applying them to human history. He has analysed historical records on economic activity, demographic trends and outbursts of violence in the United States, and has come to the conclusion that a new wave of internal strife is already on its way" "In their analysis of long-term social trends, advocates of cliodynamics focus on four main variables: population numbers, social structure, state strength and political instability."
**  modeling  history  NatureJournal 
august 2012 by pierredv
Technology Review: Blogs: arXiv blog: The Evolution of Overconfidence
Work by Johnson & Fowler "if the potential reward is at least twice as great as the cost of competing, then overconfidence is the best strategy. In fact, overconfidence is actually advantageous on average, because it boosts ambition, resolve, morale, and persistence. In other words, overconfidence is the best way to maximize benefits over costs when risks are uncertain" "Their model implies that optimal overconfidence increases with the magnitude of uncertainty. So the greater the risk, the more overconfident individuals should become. "Johnson and Fowler use that finding to predict that overconfidence will be particularly prevalent in domains where the perceived value of a prize sufficiently exceeds the expected costs of competing. "
over-confidence  modeling  psychology  research  evolution 
october 2011 by pierredv
The evolution of overconfidence : Nature Sep 2011
From the abstract: "we present an evolutionary model showing that, counterintuitively, overconfidence maximizes individual fitness and populations tend to become overconfident, as long as benefits from contested resources are sufficiently large compared with the cost of competition"
research  modeling  evolution  psychology  overconfidence  NatureJournal 
september 2011 by pierredv
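The "reward at least twice the cost" threshold quoted above has a simple expected-value reading (a simplified illustration, not Johnson & Fowler's full evolutionary model): if two equally matched claimants contest a resource of value r at a fighting cost c each, contesting pays off in expectation exactly when r > 2c.

```python
# Simplified expected-value reading of the Johnson & Fowler threshold
# (an illustration, not their actual model): two equally matched
# claimants fight over a resource of value r, each paying cost c;
# each wins half the time, so the expected payoff of contesting is
# r/2 - c, which is positive exactly when r > 2c.

def expected_contest_payoff(r, c):
    """Expected payoff when both parties claim: win half the time, pay c."""
    return r / 2 - c

assert expected_contest_payoff(2.1, 1.0) > 0  # reward just over 2x cost
assert expected_contest_payoff(1.9, 1.0) < 0  # below the threshold
```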
Building Rome in a Day - Sameer Agarwal
reconstructing entire cities from images harvested from the web via NewScientist
photography  architecture  modeling  visualization  city  geography  NewScientist 
september 2009 by pierredv
New Computer Modeling Program Can Help Hospitals Prepare For The Worst
"A new and novel computer modeling platform developed through intensive, multidisciplinary collaboration at New York University can help hospitals and cities to be more prepared for catastrophic public health scenarios"
agentbasedmodeling  modeling  healthcare  medicine 
june 2009 by pierredv
Medical News: New Computer Model Improves Catastrophe Preparedness - Emergency Medicine, MedPage Today
Planning with Large Agent Networks against Catastrophes, or PLAN C, is an "agent-based modeling approach" that allows researchers to assess both individual and system-wide effects in public health disasters
healthcare  complexity  agentbasedmodeling  policy  modeling 
june 2009 by pierredv
Simulating a public health disaster using multiple variables can assist hospitals and cities in preparing for worst-case scenarios - News
"Plan C uses a powerful, large-scale computational, multi-agent based disaster simulation framework involving as many as thousands of variables or agents – from existing hospital beds and emergency department services to hospital surge capacity and behavioural and psychosocial characteristics – to anticipate public response to an attack. It has been able to simulate the complex dynamics of emergency responses in such scenarios as a chemical release, food poisoning, and smallpox."
complexity  policy  agentbasedmodeling  modeling  healthcare 
june 2009 by pierredv
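The agent-based approach behind PLAN C can be illustrated with a minimal toy model (a generic SIR-style epidemic sketch with made-up parameters, not PLAN C itself): each agent carries its own state, and system-wide outcomes like the attack rate emerge from local interactions.

```python
import random

random.seed(7)

# Minimal agent-based SIR sketch (illustrative only, not PLAN C):
# agents are Susceptible, Infected, or Recovered; infected agents
# transmit to random contacts each step and recover after a fixed
# infectious period. All parameters are assumptions for illustration.
N, STEPS = 500, 60
P_TRANSMIT, CONTACTS, RECOVERY_STEPS = 0.05, 10, 14

state = ["S"] * N
timer = [0] * N
state[0] = "I"  # index case

for _ in range(STEPS):
    newly_infected = []
    for i in range(N):
        if state[i] == "I":
            timer[i] += 1
            # each infected agent meets CONTACTS random agents
            for j in random.sample(range(N), CONTACTS):
                if state[j] == "S" and random.random() < P_TRANSMIT:
                    newly_infected.append(j)
            if timer[i] >= RECOVERY_STEPS:
                state[i] = "R"
    for j in newly_infected:
        state[j] = "I"

attack_rate = sum(s != "S" for s in state) / N  # fraction ever infected
```

Swapping the disease dynamics for hospital beds, surge capacity, and behavioural rules gives the flavour of the thousands-of-agents disaster simulations described above.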
NetLogo is a cross-platform multi-agent programmable modeling environment
via the Computational Legal Studies blog; Katz's syllabus for ABM in law
simulation  complexity  modeling  visualization  programming 
june 2009 by pierredv
CostQuest Associates
via Paul Garnett
analysing broadband costs: "our mission is to provide information, support, tools, techniques, and analysis of your costs and profitability"
broadband  modeling  commerce 
may 2009 by pierredv
Decision and Information Sciences - Electricity Market Complex Adaptive System (EMCAS)
"EMCAS is increasingly used to study restructuring issues in the U.S., Europe, and Asia. Clients include regulatory institutions interested in market design and consumer impact issues, transmission companies and market operators studying system and market performance, and generation companies analyzing strategic company issues. In its first application, CEEESA staff members used the software to simulate the Illinois and Midwest power markets. This was done for the Illinois Commerce Commission which needed advice on whether the existing transmission system could support a competitive market. Other model implementations are conducted for clients in Europe and Asia."
electricity  governance  modeling  computationalpublicpolicy 
april 2009 by pierredv
Why economic theory is out of whack - science-in-society - 19 July 2008 - New Scientist
Review of current information theory/economics; simulations, flocking. "Physicist Jean-Philippe Bouchaud and colleagues at Capital Fund Management in Paris studied the news feeds produced by Dow Jones and Reuters that provide real-time reports of items of potential interest to investors. Looking at more than 90,000 news items relevant to hundreds of stocks over a two-year period, they studied how 'jumps' in stock prices - sudden, large movements - were linked to news items. They weren't. Most such jumps weren't directly associated with any news at all, and most news items didn't cause any jumps. 'Jumps seem to occur for no identifiable reason,' Bouchaud says." Modeling of why prices fluctuate so much: "public and private information tends to keep prices around realistic values, as the classical equilibrium model says it should. [But social feedback] creates groups of people coordinated in their actions, which in turn leads to bubbles"
economics  finance  complexity  simulation  modeling  **  NewScientist 
december 2008 by pierredv
NTIA - Microcomputer Spectrum Analysis Models (MSAM)
installable Longley-Rice (aka Irregular Terrain Model) calculator
NTIA  wireless  spectrum  modeling  propagation  rf  engineering 
january 2008 by pierredv
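The Irregular Terrain Model implemented in MSAM is far more involved (terrain profiles, diffraction, tropospheric scatter), but free-space path loss is the standard sanity-check floor that any terrain-model prediction should exceed. A minimal sketch of that baseline:

```python
import math

# Free-space path loss baseline (not the Longley-Rice/ITM calculation
# itself, which adds terrain-dependent losses on top of this floor).
# Constant 32.44 applies for distance in km and frequency in MHz.
def fspl_db(d_km, f_mhz):
    """Free-space path loss in dB for distance in km, frequency in MHz."""
    return 20 * math.log10(d_km) + 20 * math.log10(f_mhz) + 32.44

loss = fspl_db(10, 100)  # 10 km path at 100 MHz -> 92.44 dB
```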
