PATRICK H. O'CALLAGHAN
  • Home
  • Research Summary
  • Papers
  • Photos
  • Local walks/runs
  • cv

The link Papers will bring you to a list of publications and working papers.

Overview
My main research interest is the study of behaviour and decision making from a mathematical viewpoint, although I am currently diversifying my research portfolio to include applications to finance, taxation and technology (notably blockchain). My work on decision theory has applications to both econometrics and finance.

Multi-regional Input–Output and Computable General Equilibrium Modelling
Since joining the Australian Institute of Business and Economics, I have been working on Input–Output and Computable General Equilibrium Modelling in a multi-regional setting. There are three main strands to this literature: traditional IO, CGE modelling and embedded IO (within a macro model). All of these methods place production networks at the heart of the economy. Although groundbreaking in its time, traditional IO is based on the simple idea of solving the equation Ax + f = x, where A is an adjacency matrix that represents flows of income from one sector to another (see heat map below); x is endogenous economic output; and f is exogenous final demand. CGE is typically based on the idea of maximising a representative agent's utility as a function of final demand, subject to the constraint that every market in the production economy is in equilibrium. The modern literature on embedded IO takes a macroeconomic model (for instance, a model with endogenous growth) and embeds the above IO framework within it to capture sectoral effects.
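To make the IO step concrete, here is a minimal numerical sketch in Python of solving Ax + f = x, equivalently (I − A)x = f. The 3-sector coefficient matrix A and final-demand vector f below are toy numbers chosen purely for illustration; they are not data from any of the projects mentioned here.

```python
import numpy as np

# Toy 3-sector coefficient matrix: A[i, j] is the input from sector i
# required per unit of output of sector j (illustrative numbers only).
A = np.array([
    [0.10, 0.30, 0.05],
    [0.20, 0.05, 0.25],
    [0.15, 0.10, 0.10],
])

# Hypothetical exogenous final demand for the three sectors.
f = np.array([100.0, 150.0, 80.0])

# Solve Ax + f = x, i.e. (I - A) x = f, for endogenous output x.
I = np.eye(A.shape[0])
x = np.linalg.solve(I - A, f)

# Leontief inverse (I - A)^(-1); its column sums are the usual output multipliers.
L = np.linalg.inv(I - A)

print("Endogenous output x:", x)
print("Output multipliers:", L.sum(axis=0))
```

In the CGE and embedded-IO strands, a linear-algebra step of this kind is combined with optimising behaviour and market-clearing conditions rather than solved in isolation.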
To date, I have been involved in three projects, two of which are complete and relate to the Queensland economy: the Aluminium sector and the impact of COVID-19 on exports. The third is a broader project to build a new computational framework that is flexible enough to handle embedded IO and CGE modelling.
Events: A particularly simple event E contains all continuous paths in ℝ that begin with value a at time zero and pass through the interval [b,c] at time t1, the interval [b,a] at time t2 and the point b at time t3. The figure presents two possible trajectories: only the lower one (in blue) belongs to E; the other (in red) does not pass through [b,a] at time t2 or the point b at time t3. Far more complex events are needed to elicit beliefs.
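In set notation (under the reading above, with C(ℝ₊) denoting the continuous real-valued paths on [0,∞)), this event can be written as E = { ω ∈ C(ℝ₊) : ω(0) = a, ω(t1) ∈ [b,c], ω(t2) ∈ [b,a], ω(t3) = b }.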
Decision making and prediction
Much of my research focuses on what lies at the heart of all economic activity: decision-making. I strive to ensure that the model we attribute to economic agents is robust to the background structure we build into the decision maker's model.
For instance, in the standard model of expected utility, an agent's preferences are defined on a convex set of lotteries or prospects.
Yet it is fair to say that the typical subject in an experiment is unaware of the convex set the experimenter has in mind.
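As a standard textbook illustration of what that convex structure does: if p and q are lotteries in the set and α ∈ [0,1], the mixture αp + (1−α)q is again a lottery, and an expected-utility functional is linear in such mixtures, U(αp + (1−α)q) = αU(p) + (1−α)U(q). The classical axioms lean directly on the availability of these mixtures, which is why the background structure matters.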

In my current working paper "Prudent inductive inference with limited data", I generalise the model of Gilboa and Schmeidler (Econometrica, 2003) to allow for forecasters who have limited data or who lack experience. To compensate for this structural weakening, I require that the decision maker is prudent and open-minded enough to draft her current forecast with the potential arrival of new information in mind.

In "Axioms for measuring utility on partial mixture sets" (published in the Journal of Mathematical Economics), I provide a framework that encompasses a number of benchmark models of uncertainty and ambiguity. I show that relaxing standard conditions on the set of prospects requires a compensatory strengthening of the axioms on preferences if the usual cardinal linear utility is to exist. In a different setting, in my paper "Parametric continuity of utility when the topology is coarse" (a single-authored publication in the Journal of Mathematical Economics), a similar tradeoff arises in relation to the set of parameters that index utility.



Taxation and industrial organisation
I have a working paper on "Optimal taxation in networks with informal production". Like bounded rationality, informal production introduces bounds on the ability of the government to tax firms. This work is of particular relevance to developing economies, where the informal sector is large and where it may be efficient to let small firms and transactions slip through the tax net. I show that production efficiency (as in the seminal work of Diamond and Mirrlees, 1971) is feasible, but only if the government can impose a cashflow tax that depends on bundles of goods. Each chain of supply should contain at least one formal link, and production-specific decisions should be left undistorted. In practice, the government will typically need to invest in infrastructure (e.g. technology and finance) to expand the set of efficient equilibria and enable formal firms to overcome any diseconomies of scope that arise due to bundling.

Informal production forms a large and important part of most developing economies. In the presence of informal production, the usual value-added tax (VAT) or general sales tax (GST) may cause distortions along the supply chain. This is partly because informal producers do not declare sales and therefore cannot reclaim VAT paid on inputs, and partly because input prices may be distorted by the tax. This is all the more important given the widespread adoption of the VAT in many countries during the last four decades.
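As a purely hypothetical illustration of the first channel: with a 10% VAT and inputs sold at a net price of 100, a formal buyer pays 110 but reclaims the 10 of input VAT, so its effective cost is 100; an informal buyer cannot reclaim it, so its effective cost is 110, and that unrecovered tax is passed on, and may be taxed again, further down the chain.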

Existing models adopt a general equilibrium approach and restrict attention to linear pricing. In practice, nonlinear pricing is commonly observed, and the set of active firms is determined through competition in a manner that is better described as a game. We consider a system of Bertrand–Edgeworth markets with free entry, where firms compete in both price and quantity.

Eliciting beliefs about stock prices
With behavioural finance in mind, my working paper "Eliciting beliefs about stock prices" applies the main result of "Axioms for measuring utility on partial mixture sets".
I propose a method for eliciting beliefs about stochastic processes in continuous time.

We argue that paths and moments provide a more tractable alternative to the standard approach of Savage or Anscombe and Aumann, which is better suited to "small decision worlds". The standard approach takes events as the primitive object of measurement. Beliefs are elicited by asking the agent to choose between pairs of binary options that pay "a dollar" if the event obtains and zero otherwise.
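As a textbook illustration of how such bets identify beliefs (not a claim about the paper's own construction): if the agent maximises subjective expected utility with the normalisation u(0) = 0 and u(1) = 1, then indifference between the bet on E and a sure amount m gives P(E) = P(E)·u(1) + (1 − P(E))·u(0) = u(m), so the belief attached to E is read off from the certainty equivalent m.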

The issue is that even the simplest events in such infinite-dimensional settings are cylinder sets (see figure above). I argue that even these events are too complicated for most experimental subjects to evaluate. Even for a machine that has learnt the process, eliciting or testing moments should be more informative.


Exclusion from credit markets when screening is competitive
In this project, we explore the conditions under which exclusion from credit markets can arise when banks compete with one another, screen borrowers and are free to offer nonlinear price schedules.
