NEW APPROACHES TO ECONOMIC CHALLENGES (NAEC)

NEW ANALYTICAL TOOLS AND TECHNIQUES FOR ECONOMIC POLICYMAKING

OECD-NAEC and Baillie Gifford

In association with Partners for a New Economy (P4NE); Rebuilding Macroeconomics; Institute for New Economic Thinking (INET) Oxford; European Commission Joint Research Centre; International Institute for Applied Systems Analysis (IIASA); Fields Institute; Complex System Institute of Paris IdF; the Santa Fe Institute; and Capital Fund Management (CFM)

Abstracts and Bios

15-16 April 2019
OECD Conference Centre, Paris, CC4

Further information: William Hynes – william.hynes@oecd.org

Understanding of economic issues such as growth, financial crises, systemic risk, innovation and sustainability can benefit from the revolution taking place across a range of scientific disciplines and in the social sciences. This revolution is being driven by the interaction between technological progress in computing and communications and the new sources and greater quantities of data this makes available.

This NAEC conference offers a timely opportunity for policy-makers, academics and researchers in economics to discuss the state-of-the-art policy applications emerging from the new analytical tools and techniques. It will look at how methodological innovations and interdisciplinary approaches such as agent-based modelling, nowcasting, machine learning, and network analysis could contribute to a better understanding of the complexity and interaction of our economic, financial, social and environmental systems.

Monday 15 April

9:30 - 10:00    Opening remarks:
Angel Gurria, OECD Secretary-General (video)
Gabriela Ramos, OECD Chief of Staff and Sherpa
Laurence Boone, OECD Chief Economist
Martine Durand, OECD Chief Statistician

10:00 – 11:00   Session 1: Why Do We Need New Analytical Tools and Techniques?

Moderator: Gabriela Ramos, OECD Chief of Staff and Sherpa

Speakers: J. Doyne Farmer, Director of Complexity Economics, Institute for New Economic Thinking, and Santa Fe Institute, and Rebuilding Macroeconomics

J. Doyne Farmer is Director of the Complexity Economics programme at the Institute for New Economic Thinking at the Oxford Martin School, the Baillie Gifford Professor at the Mathematical Institute at the University of Oxford, and an External Professor at the Santa Fe Institute. His current research is in economics, including agent-based modelling, financial instability and technological progress.
He was a founder of Prediction Company, a quantitative automated trading firm that was sold to UBS in 2006. His past research includes complex systems, dynamical systems theory, time series analysis and theoretical biology. During the eighties he was an Oppenheimer Fellow and the founder of the Complex Systems Group at Los Alamos National Laboratory. While a graduate student in the 1970s he built the first wearable digital computer, which was successfully used to predict the game of roulette.

Robert Axtell, Chair of the Department of Computational Social Science at George Mason University, and Santa Fe Institute

Robert Axtell is an Associate Professor of the Santa Fe Institute who works at the intersection of economics, behavioral game theory, and multi-agent systems computer science. His most recent research attempts to emerge a macroeconomy from tens of millions of interacting agents. He is Department Chair of the new Department of Computational Social Science at George Mason University (Fairfax, Virginia, USA). He teaches courses on agent-based modeling, mathematical modeling, and game theory. His research has been published in "Science," "Proceedings of the National Academy of Sciences USA," and leading field journals. Popular accounts have appeared in newspapers, magazines, books, online, on the radio and in museums. He is the developer of Sugarscape, an early attempt to do social science with multi-agent systems, and co-author of "Growing Artificial Societies: Social Science from the Bottom Up" (MIT Press, 1996). Previously, he was a Senior Fellow at the Brookings Institution (Washington, D.C., USA) and a founding member of the Center on Social and Economic Dynamics there. He holds an interdisciplinary Ph.D. from Carnegie Mellon University (Pittsburgh, USA).

Jean-Philippe Bouchaud, Chairman, Capital Fund Management (CFM) and Rebuilding Macroeconomics

Jean-Philippe Bouchaud is Chairman and Chief Scientist of CFM. He supervises the research and maintains strong links between the research team and the academic world. He is also a professor at Ecole Polytechnique, where he teaches Statistical Mechanics and a course on 'Complex Systems'. He joined CFM in 1994.

William H. Janeway, Faculty of Economics, Cambridge University and Advisor, Warburg Pincus

William H. Janeway has lived a double life of "theorist-practitioner," according to the legendary economist Hyman Minsky, who first applied that term to him twenty-five years ago. In his role as "practitioner," Bill Janeway has been an active growth equity investor for more than 40 years. He is a senior advisor at Warburg Pincus, where he has been responsible for building the information technology investment practice, as well as a director of Magnet Systems and O'Reilly Media. As a "theorist," he is an affiliated member of the Faculty of Economics of Cambridge University, a member of the board of directors of the Social Science Research Council and the Fields Institute for Research in the Mathematical Sciences, and of the Advisory Board of the Princeton Bendheim Center for Finance. He is a co-founder and member of the Governing Board of the Institute for New Economic Thinking (INET), and a member of the Board of Managers of the Cambridge Endowment for Research in Finance (CERF). Following its publication in November 2012, his book Doing Capitalism in the Innovation Economy: Markets, Speculation and the State (Cambridge University Press) became a classic. The fully revised and updated second edition, Doing Capitalism in the Innovation Economy: Reconfiguring the Three-Player Game between Markets, Speculators and the State, was published in May 2018.
Michael Jacobs, Professorial Research Fellow, Sheffield Political Economy Research Institute (SPERI)

Michael Jacobs is a Professorial Research Fellow and Head of Engagement and Impact at SPERI. He is an economist and political theorist, specialising in post-neoliberal political economy, climate change and environmental policy, and green and social democratic thought. He is responsible for oversight and leadership with respect to SPERI's engagement and impact work. Michael leads SPERI's Corporate Power & the Global Economy research theme with Merve Sancak.

Prior to joining SPERI Michael was Director of the IPPR Commission on Economic Justice, based at the UK think tank the Institute for Public Policy Research. He was principal author and editor of the Commission's final report Prosperity and Justice: A Plan for the New Economy (2018). Originally a community worker and adult educator, Michael later became a director and then managing director of CAG Consultants, where he worked in local economic development and sustainable development. He was subsequently an ESRC research fellow at Lancaster University and the LSE. He was General Secretary of the think tank and political association the Fabian Society from 1997 to 2003. From 2004 to 2007 Michael was a member of the Council of Economic Advisers at the UK Treasury, and from 2007 to 2010 he was a Special Adviser to Prime Minister Gordon Brown, with responsibility for energy, environment and climate policy.

After leaving government in 2010, Michael advised governments and others on international climate change policy in the run-up to the UN Climate Conference in Paris in December 2015. He was a founder and senior adviser to the Global Commission on the Economy and Climate.

11:15 – 12:30   Session 2: Nowcasting

Moderator: Lucrezia Reichlin, Professor of Economics, London Business School

Speakers: Laurent Ferrara, Head of International Macro Division, Banque de France

When are Google data useful to nowcast GDP? An approach via pre-selection and shrinkage

Nowcasting GDP growth is extremely useful for policy-makers assessing macroeconomic conditions in real time. In this paper, we aim at nowcasting euro area GDP with a large database of Google search data. Our objective is to check whether this specific type of information can increase GDP nowcasting accuracy, and when, once we control for official variables. To this end, we estimate shrunk bridge regressions that integrate Google data optimally screened through a targeting method, and we show empirically that this approach provides some gain in pseudo-real-time nowcasting of euro area quarterly GDP growth. In particular, we find that Google data bring useful information for GDP nowcasting during the first four weeks of the quarter, when macroeconomic information is lacking. However, as soon as official data become available, their relative nowcasting power vanishes. In addition, a true real-time analysis confirms that Google data constitute a reliable alternative when official data are lacking.

Laurent Ferrara is Head of the International Macroeconomics Division at the Banque de France in Paris, in charge of the outlook and macroeconomic forecasting for advanced economies, as well as global policy issues such as exchange rates, commodities and global imbalances. The main tasks of this division of around 20 people are policy briefing, preparation of international meetings (ECB, IMF, OECD, G20) and economic research.
He is also active in academia and was appointed Adjunct Professor of Economics at the University of Paris Nanterre in September 2011. Laurent Ferrara is Director of the International Institute of Forecasters, an international association aiming at bridging the gap between theory and applications in forecasting through the organisation of workshops and conferences and the publication of an academic journal, the International Journal of Forecasting. He is also an associate editor of this journal. Dr. Ferrara holds a PhD in Applied Mathematics from the University of Paris North (2001) and a Research Habilitation in Economics from the University of Paris 1 Panthéon-Sorbonne (2007). His academic research mainly focuses on macroeconomic forecasting, international economics, econometric methods, non-linear modelling and business cycle analysis. He has published more than 50 papers in international and national academic journals, chapters in books, as well as a book on time series analysis and forecasting.

Elias Albagli, Chief Economist, Central Bank of Chile

Real-time VAT data in Chile: Applications for Monetary and Financial Policy

The Central Bank of Chile has been developing new administrative data sources to advance its analytical and forecasting tools. Since 2016, the tax administration authority (SII) has mandated that transactions between firms be electronically transmitted in real time for VAT accounting purposes. This wealth of data allows several important advances, both for projection and macroeconomic analysis purposes and for structural economic research. We highlight three applications. First, real-time transactions allow computing value-added proxies for several sectors, enhancing nowcasting capacity and eventually diminishing the lag between economic activity data and monetary policy decisions by about a month. Second, the complete network structure also makes it possible to better interpret linkages between the supply and demand sides of the national accounts in real time.
For instance, an expansion of the wholesale machine and equipment intermediation sector can be linked precisely to the end-user sectors that are increasing fixed investment. This knowledge is of particular interest for a commodity-exporting country, where mining investment has different lags and spillovers on overall economic activity than other sectors. Third, merging these data with credit information (also available for the universe of Chilean firms) is of potential use in detecting macroeconomic and financial stability risks. Indeed, the network structure in real time can be used to detect disruptions (e.g., imminent firm closures), assess their spillovers to interconnected firms (customers and suppliers), and predict the overall macroeconomic impact across different sectors as well as the implications for the debt service capacity of affected firms.
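The first application described above (value-added proxies from real-time invoice data) can be sketched in stylised form: value added per sector is approximated as sales minus intermediate purchases, aggregated from firm-to-firm records. All firm names, sectors and amounts below are invented, and this is not the Central Bank of Chile's actual methodology.

```python
from collections import defaultdict

# Stylised electronic-invoice records (seller, buyer, amount); sales to the
# placeholder "FINAL" stand in for end-consumer demand.
invoices = [
    ("mine_a", "smelter_b", 100.0),
    ("smelter_b", "FINAL", 180.0),
    ("farm_c", "FINAL", 50.0),
]
sector = {"mine_a": "mining", "smelter_b": "manufacturing", "farm_c": "agriculture"}

def value_added_by_sector(invoices, sector):
    """Proxy: value added = sales minus intermediate purchases, per sector."""
    sales, purchases = defaultdict(float), defaultdict(float)
    for seller, buyer, amount in invoices:
        sales[seller] += amount
        if buyer != "FINAL":              # final consumption is not an intermediate input
            purchases[buyer] += amount
    va = defaultdict(float)
    for firm in set(sales) | set(purchases):
        va[sector[firm]] += sales[firm] - purchases[firm]
    return dict(va)

va = value_added_by_sector(invoices, sector)
print({s: va[s] for s in sorted(va)})   # → {'agriculture': 50.0, 'manufacturing': 80.0, 'mining': 100.0}
```

Because every transaction carries both a seller and a buyer, the same records also yield the inter-sector network that the second and third applications exploit.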

Elías Albagli has been Director of the Central Bank of Chile's Monetary Policy Division since August 2018. Previously he was Director of the Central Bank of Chile's Research Division from June 2018, and before that Manager of Modeling and Economic Analysis at the Bank (December 2014 through June 2018). He holds a Bachelor's degree in Business and a Master's in Financial Economics from the Catholic University of Chile (2002), where he received the best graduating student award. He received his Ph.D. in Economics from Harvard University in 2010. Earlier he worked in the Central Bank's Economic Research Division, most recently as a senior economist (June 2013 to November 2014) and also as an economic analyst (2002–2005). He was an assistant professor of Economics and Finance at the University of Southern California from 2010 to 2013. Mr. Albagli has taught courses on Financial Markets and Macroeconomics at different institutions, including the Economics Department at the Catholic University of Chile and the Economics and Business Management Department at the University of Chile. He has published numerous journal articles, book chapters and working papers on issues related to macroeconomics and financial markets.

14:00 - 16:00   Session 3: Agent-Based Modelling

Moderator: J. Doyne Farmer, Director of Complexity Economics, Institute for New Economic Thinking, University of Oxford, and Rebuilding Macroeconomics

Speakers: Alissa Kleinnijenhuis, Researcher, Mathematical Institute, Oxford, and Institute for New Economic Thinking, University of Oxford, and Thom Wetzer, DPhil candidate in Law and Finance, University of Oxford

Foundations of System-Wide Stress Testing

Microprudential stress tests have been credited with restoring confidence in the banking system and allowing for a successful recapitalisation of banks (Bernanke (2013)). They have gained enormous importance in the post-crisis regulatory toolkit. Their core goal is to assess systemic risk.
Despite these successes, microprudential stress tests lack interconnections, and thereby the ability to consider endogenously amplified systemic risk. This fundamental deficiency impairs their ability to assess resilience, which has led various academics and regulators to call for system-wide stress tests (e.g. Brazier (2017)). Yet no generic method exists (Anderson et al. (2018)). Challenges include (Anderson et al. (2018)): to capture systemic risk amplification mechanisms comprehensively and consistently; to encapsulate the behavioural responses to shocks; to encompass interactions between constraints and behaviour; to consistently incorporate non-banks; to reflect the heterogeneity of objectives, resources and constraints (Danielsson and Shin (2003)); to flexibly adjust to a changing financial system; and to deal with a lack of sufficiently granular and well-covered data.

In this paper, we propose a novel method for system-wide stress testing – to our knowledge, the first to jointly tackle these challenges. It consists of five building blocks: institutions, contracts, constraints, markets, and behaviour. Together, these make it possible to track contagious dynamics in a multiplex network and assess fragility under various policy set-ups. We illustrate the power of this method by providing an implementation of the building blocks. We show that systemic risk may be significantly underestimated if microprudential stress tests are not supplemented with a macroprudential overlay. Based on the tool's foundations, credible system-wide stress tests can be built to crown the macroprudential toolkit.

Alissa Kleinnijenhuis is a DPhil Candidate in Financial and Computational Mathematics at the University of Oxford and the Institute for New Economic Thinking at the Oxford Martin School. She is also affiliated with the Oxford-Man Institute of Quantitative Finance. She works under the supervision of Professor J. Doyne Farmer.
Her research focuses on system-wide stress testing and systemic risk. Alissa is a visiting academic at the Bank of England (London), and has also conducted research on stress testing at the European Central Bank (Frankfurt). In addition, her professional experience includes work for Morgan Stanley (London) and Rogge Global Partners (London). Alissa holds a B.A. (Hons) in Mathematics and Economics from University College Utrecht (partially completed at UC Santa Barbara) and an M.Sc. in Mathematics and Finance from Imperial College London.

Thom Wetzer is a DPhil Candidate in Law and Finance at the Oxford Faculty of Law, the Oxford-Man Institute for Quantitative Finance and the Institute for New Economic Thinking at the Oxford Martin School. His research examines incentive misalignments and the role of law in mitigating them, with a particular focus on systematic misalignments that generate systemic risk in financial systems. He also works on climate risk in the context of the post-carbon transition.
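The contagious dynamics that a system-wide stress test must track, as described in the abstract above, can be illustrated with a deliberately minimal sketch of a single amplification channel: counterparty default losses propagating through an exposure network. The banks, exposures, shock size and loss-given-default below are all invented, and the authors' multiplex framework covers far more channels than this.

```python
# One contagion channel: an equity shock defaults a bank, its creditors write
# down their claims, and further defaults may follow in later rounds.
def default_cascade(equity, exposures, shocked, shock, lgd=0.6):
    """Propagate an initial equity shock through rounds of counterparty losses."""
    equity = dict(equity)                       # work on a copy
    equity[shocked] -= shock
    defaulted = set()
    while True:
        newly = {b for b, e in equity.items() if e <= 0 and b not in defaulted}
        if not newly:
            return equity, defaulted
        defaulted |= newly
        for creditor, book in exposures.items():
            for debtor, amount in book.items():
                if debtor in newly:
                    equity[creditor] -= lgd * amount   # write down the claim on the defaulter

exposures = {"bank_a": {"bank_b": 30.0},        # exposures[creditor][debtor] = amount lent
             "bank_b": {"bank_c": 40.0},
             "bank_c": {}}
equity = {"bank_a": 25.0, "bank_b": 20.0, "bank_c": 15.0}

eq, defaulted = default_cascade(equity, exposures, shocked="bank_c", shock=20.0)
print(sorted(defaulted))   # → ['bank_b', 'bank_c']: bank_b fails in the second round
```

Here the initial failure of bank_c drags down bank_b through its 40-unit exposure, while bank_a absorbs its smaller loss and survives with equity 7.0 — the kind of endogenous amplification a purely microprudential test would miss.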

Thom has been a visiting scholar at Columbia Law School, the Berkeley School of Law, and Yale University, and is currently a visiting academic at the Bank of England and an academic consultant at the European Central Bank. He has worked at the European Commission, Goldman Sachs, and De Brauw Blackstone Westbroek, and is a 'Global Shaper' at the World Economic Forum.

Stanislao Gualdi, Research Fellow, Capital Fund Management

Optimal Inflation Target: Insights from an Agent-Based Model

Which level of inflation should Central Banks be targeting? We investigate this issue in the context of a simplified Agent-Based Model of the economy. Depending on the value of the parameters that describe the behaviour of agents (in particular inflation anticipations), we find a rich variety of behaviour at the macro level. Without any active monetary policy, our ABM economy can be in a high-inflation/high-output state or in a low-inflation/low-output state. Hyperinflation, deflation and 'business cycles' between coexisting states are also found. We then introduce a Central Bank with a Taylor-rule-based inflation target and study the resulting aggregate variables. Our main result is that too-low inflation targets are in general detrimental to a CB-monitored economy. One symptom is a persistent under-realisation of inflation, perhaps similar to the current macroeconomic situation. Higher inflation targets are found to reduce both unemployment and the incidence of negative interest rate episodes. Our results are compared with the predictions of the standard DSGE model.

Stanislao Gualdi is a Research Fellow at Capital Fund Management. He has a PhD in Theoretical Physics from the University of Fribourg and a Master's degree in Theoretical Physics from Sapienza Università di Roma. He was a Postdoctoral Researcher at Ecole Centrale Paris.
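The Taylor-rule feedback at the heart of the abstract above can be illustrated with a deliberately simple toy dynamic: the policy rate leans against deviations of inflation from target, demand responds to the real rate, and inflation adjusts sluggishly. The equations and parameter values are invented for illustration and are not those of the authors' Agent-Based Model.

```python
# Toy Taylor-rule loop: does inflation converge to the announced target?
def simulate(pi_target, steps=200, phi=1.5, r_star=0.02):
    """Iterate a stylised inflation/output-gap dynamic under a Taylor rule."""
    pi = 0.10                                       # start from high inflation
    for _ in range(steps):
        r = r_star + pi + phi * (pi - pi_target)    # Taylor rule: lean against deviations
        gap = -0.5 * (r - pi - r_star)              # demand falls when the real rate rises
        pi = 0.9 * pi + 0.1 * (pi_target + gap)     # sluggish (adaptive) inflation adjustment
    return pi

print(round(simulate(pi_target=0.02), 4))   # → 0.02: inflation settles on the target
```

In this linear toy the target is always reached; the point of the authors' ABM is precisely that with heterogeneous agents and expectation feedbacks this need not happen, and low targets can leave inflation persistently under-realised.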
Torsten Heinrich, Researcher, Institute for New Economic Thinking, University of Oxford

An ABM of the insurance-reinsurance sector: Conclusions for systemic risk, market structure, and the insurance cycle

Risk models are employed in the insurance and reinsurance industry to assess the probability and size of risk events. Under the new Solvency II regulations the choice of models that can be used in the insurance sector has become severely limited. This creates a danger that all insurance companies may rise and fall in tandem, making the sector brittle and creating a public welfare problem. We present a novel agent-based model of the catastrophe insurance and reinsurance sectors to study this constraint. More than other branches of the insurance industry, catastrophe insurance is subject to heavy-tailed distributions, which occur for damage size, peril frequency, claims, losses, and bankruptcy events. As a consequence, agent heterogeneity, interaction and stochastic influences are of crucial importance. Characterising the system requires studying ensembles of counterfactual cases and considering not only the mean but also the dispersion of realisations. Agent-based modelling is well suited to these requirements. We discuss the properties of the model; we substantiate the validation of modelling decisions with economy-level, firm-level, and contract-level data; we explain the micro- and macro-calibration of the model; and we show some selected results: 1. The reproduction of the insurance cycle. 2. The frequency and distribution of bankruptcies across settings with different levels of risk model homogeneity, an aspect directly connected to systemic risk in insurance. 3. Other systematic effects of risk model homogeneity. 4. The sensitivity of the model with respect to parameters such as the rate of market entry, the interest rate, and the capital retention requirements. 5. The resulting market structures in terms of firm sizes and relative shares of insurance and reinsurance business. 6. A market with several operational risk models is considerably more profitable, more competitive and has a higher capacity than a market with only one risk model.

Torsten Heinrich is a researcher at the Institute for New Economic Thinking (INET) at the Oxford Martin School of the University of Oxford, and teaches at the University of Bremen, Germany. His work concentrates methodologically on agent-based modelling, game theory, complexity economics and evolutionary economics, alongside empirical work on industrial organisation and technological change, with an interest in new technologies, in the potential they create, and in the economic, social and political consequences their implementation could entail. He studied economics at the Dresden University of Technology (Dresden, Germany) and the Universidad Autónoma de Madrid (Madrid, Spain), graduating in 2007. He received his PhD from the University of Bremen, Germany, in 2011 with a thesis on technological change and growth patterns in the presence of network effects. Working on complex systems, agent-based modelling, simulation and strategic games in economics, he has edited special issues of scientific journals and authored both journal articles and monographs.
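The risk-model-homogeneity mechanism behind the bankruptcy results in the abstract above rests on a simple intuition: if every insurer reserves capital according to the same model, a catastrophe that exceeds that model's estimate bankrupts them all at once, whereas heterogeneous models keep failures idiosyncratic. A stylised illustration, with invented capital figures and catastrophe size (the authors' ABM is, of course, far richer):

```python
# Insurers reserve capital per their risk model; a catastrophe bankrupts those
# whose model-implied reserves fall short of the realised loss.
def failures(reserves, realised_loss):
    """Indices of insurers whose reserves are below the realised loss."""
    return [i for i, reserve in enumerate(reserves) if reserve < realised_loss]

homogeneous = [80.0] * 5                           # one shared risk model for the sector
heterogeneous = [60.0, 80.0, 100.0, 120.0, 140.0]  # five different risk models

print(len(failures(homogeneous, 90.0)))     # → 5: every insurer fails at once
print(len(failures(heterogeneous, 90.0)))   # → 2: failures stay idiosyncratic
```

Correlated failure of the whole sector, rather than the average failure rate, is what makes model homogeneity a systemic-risk and public-welfare concern.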

Francois Lafond, Senior Research Officer, Institute for New Economic Thinking, University of Oxford

Automation and bottlenecks in occupational mobility: a data-driven network model

Many existing jobs are prone to automation, raising important concerns about the future of employment. However, history suggests that alongside automation different jobs are created, so it is crucial to understand job transitions. To do so, we impose an automation shock on a network-based labour market model. We construct an occupational mobility network, where nodes are occupations and edges link occupations that are similar enough for a worker to transition from one to the other. We then model the dynamics of employment, unemployment, and vacancies at the occupation level, based on exogenous (automation-related) reallocation of labour demand, separation and vacancy opening rates, job search, and matching. After discussing the model's calibration and its ability to reproduce the Beveridge curve, we study occupation-specific unemployment and long-term unemployment (27 or more weeks). As expected, in highly automated occupations workers are more likely to be unemployed or to stay unemployed for a long period. However, the network structure also plays an important role: workers in occupations with a similar degree of automation can have fairly different outcomes, depending on the position of the occupation in the mobility network. Automation may cause 'bottlenecks' in the mobility network, with workers unable to find jobs for long periods.
Our work highlights that retraining schemes need not be directed towards workers in occupations with a high risk of automation, but rather towards workers with limited transition possibilities.

François Lafond is a senior research officer at the Institute for New Economic Thinking, the Oxford Martin School Programme on Technological and Economic Change, and the Smith School for Enterprise and the Environment, and an associate member of Nuffield College. He received his PhD from UNU-MERIT / Maastricht University. His main areas of research are the economics of innovation, environmental economics, networks and complex systems, applied econometrics and forecasting.

Christoph Siebenbrunner, DPhil student, Mathematical Institute, University of Oxford

Money creation and liquid funding needs

Starting from a conceptual discussion of money creation as opposed to loanable funds and money multiplier theories, we categorise different forms of lending: broadly speaking, bank lending versus non-bank lending, whereby we equate the former with money creation and the latter with loanable funds theories. On this basis, we provide a definition of shadow banking, which we link to all lending that is not bank lending and hence, while increasing leverage for individual borrowers, does not increase system-wide money stocks. The purpose is then to put forward two concluding thoughts. First, the notion of money creation as a result of banks' loan creation for the private sector is compatible with the notion of liquid funding needs in a multi-bank system in which liquid fund transfers across banks happen naturally. Second, conventional interest-rate-based monetary policy has a bearing on macroeconomic dynamics precisely because of that multi-bank structure; it would lose its impact in the hypothetical case that only one ('singular') commercial bank existed.
To illustrate the latter two points, we develop a simple agent-based model, with a focus on the bank loan creation (and destruction due to repayment) process, as opposed to pure intermediary lending through capital markets, banks' proprietary trading, and all other non-bank financial and non-financial institution types' channelling of funds within the system. The model comprises bank agents (money creators), non-bank intermediaries (pure 'channelers'), a central bank (liquid base money provider), and private sector agents that borrow from banks or non-banks to finance their consumption.

Christoph Siebenbrunner is a doctoral student in Mathematics and a member of the Complexity Economics research group under the supervision of Prof. Doyne Farmer at Oxford University. His research focuses on modelling systemic risk and financial systems in general, using a wide array of methodologies including statistical modelling, network analysis and agent-based modelling. He has seven years of professional experience working as a stress testing expert and quantitative modeller for central banks including the ECB and the Austrian National Bank.

Discussant: Robert Axtell, Chair of the Department of Computational Social Science at George Mason University, and Santa Fe Institute
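The distinction between bank and non-bank lending that drives Siebenbrunner's abstract above can be made concrete with toy balance sheets: a bank loan is credited as a brand-new deposit, so the money stock grows, while a non-bank loan merely transfers an existing deposit, raising leverage but leaving the money stock unchanged. Names and amounts below are invented for illustration.

```python
# Toy deposit ledger contrasting the two lending channels.
def money_stock(deposits):
    """System-wide money stock = total deposits held at banks."""
    return sum(deposits.values())

def bank_loan(deposits, borrower, amount):
    """Bank lending credits a brand-new deposit: the money stock grows."""
    deposits[borrower] += amount

def nonbank_loan(deposits, lender, borrower, amount):
    """Non-bank lending transfers an existing deposit: leverage rises, money does not."""
    deposits[lender] -= amount
    deposits[borrower] += amount

deposits = {"saver": 100.0, "firm": 0.0}
m0 = money_stock(deposits)                      # 100.0 before any lending
bank_loan(deposits, "firm", 50.0)
m1 = money_stock(deposits)                      # 150.0: 50 of new money created
nonbank_loan(deposits, "saver", "firm", 30.0)
m2 = money_stock(deposits)                      # 150.0: unchanged by non-bank lending
print(m0, m1, m2)
```

Loan repayment would run the first operation in reverse, destroying the deposit again, which is the creation-and-destruction process the authors' agent-based model focuses on.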

16:30 – 18:00   Session 4: Network Analysis

Moderator: David Chavalarias, Director of the Complex System Institute of Paris IdF

Speakers: Rajan Patel, Technical Specialist, Bank of England

Textual complexity in bank regulation

Reforms following the financial crisis of 2007-08 have increased the volume of bank regulation and led to concerns about increased complexity. But there are few empirical measures of regulatory complexity beyond simple page counts. To measure the change in regulatory complexity precisely, we define it as a property of how standards are articulated in regulatory texts, and calculate linguistic and structural indicators that reflect the cognitive costs required to process texts.
We extract these measures from a dataset that covers the near-universe of prudential rules for banks in the United Kingdom, including EU directives and supervisory guidance, in 2007 and 2017. To understand the drivers of complexity, we compare different regulatory tools (capital, liquidity, remuneration) and the relative contribution of international versus national rules. We also compare the complexity of rules that apply to small versus large institutions, and benchmark UK against US regulation. Network maps that visualise textual cross-references show that there are many peripheral rules and a few ver
