We explore the short- and long-run implications of tax competition between jurisdictions, where governments can only tax capital at source. We do this in the context of a neoclassical growth model under commitment and capital mobility. We provide a new theoretical perspective on the dynamic capital tax externalities that emerge in this model. Numerically, we show that the net capital tax externality is positive in the short run but converges to zero in the long run. We also find that noncooperative source-based capital taxes are initially positive and slowly decline toward zero. (JEL D62, H25, H71, H73, H87)
For the most part, I’ll assume that you’ll be working with Matlab. However, I am quite happy for you to use Gauss, Fortran, Scilab, C++ or whatever language you prefer. I will not give advice on what software to use, nor on how to acquire any
For the most part, I’ll assume that you’ll be working with Matlab. However, I am quite happy for you to use Fortran, Scilab, C++, Python, R or whatever language you prefer. I will not give advice on what software to use, except to inform you of the distinction between compiled and
The first thing to note about stochastic optimization (whether static or dynamic) is that it is really not that hard. Let (Ω, F, P) be a probability space and let Z be an exogenously given random variable. Let G ⊂ F be a σ–algebra. Suppose we want to maximize E[f(X, Z)] with respect to X, subject to the constraint that X must be G–measurable. Proposition. X∗ solves this problem if, P–almost surely, E[f(X∗, Z)|G] ≥ E[f(X, Z)|G] (1) for all G–measurable random variables X. Proof. Take the unconditional expectation of both sides. Remark. The point of (1) is that you can do the maximization for each ω ∈ Ω separately, treating X(ω) as a number rather than a random variable. Moreover, if you choose X(ω) so as to maximize E[f(x, Z)|G](ω), (2) with respect to x, then you are basically guaranteed that X is G–measurable. That is: if there are multiple solutions for some ω and you go out of your way to treat similar problems differently, then you can construct a candidate solution X that maximi...
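The proof step quoted in the excerpt above ("take the unconditional expectation of both sides") is just the law of iterated expectations applied to inequality (1); written out, the step is:

```latex
% Pointwise (conditional) optimality implies global optimality:
% take unconditional expectations of (1) and apply the
% law of iterated expectations E[Y] = E[E[Y | G]].
\[
  \mathbb{E}\left[f(X^{*},Z)\mid\mathcal{G}\right]
  \;\ge\;
  \mathbb{E}\left[f(X,Z)\mid\mathcal{G}\right]
  \quad P\text{-a.s.}
\]
\[
  \Longrightarrow\quad
  \mathbb{E}\left[f(X^{*},Z)\right]
  = \mathbb{E}\bigl[\,\mathbb{E}[f(X^{*},Z)\mid\mathcal{G}]\,\bigr]
  \;\ge\;
  \mathbb{E}\bigl[\,\mathbb{E}[f(X,Z)\mid\mathcal{G}]\,\bigr]
  = \mathbb{E}\left[f(X,Z)\right],
\]
so X∗ attains the unconditional maximum among all G–measurable choices X.
```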
More precisely, the solution concept is the following. The Ramsey optimal allocation is the allocation that delivers the highest weighted sum of utilities among those allocations that form part of some competitive equilibrium. Notice that the proportionality of taxes is built into the concept of competitive equilibrium. Non-proportional taxes mean that people face different after-tax prices, and that is not consistent with competitive equilibrium.
The design and performance of computer vision algorithms are greatly influenced by the hardware on which they are implemented. CPUs, multi-core CPUs, FPGAs and GPUs have inspired new algorithms and enabled existing ideas to be realized. This is notably the case with GPUs, which have significantly changed the landscape of computer vision research through deep learning. As the end of Moore's law approaches, researchers and hardware manufacturers are exploring alternative hardware computing paradigms. Quantum computers are a very promising alternative and offer polynomial or even exponential speed-ups over conventional computing for some problems. This paper presents a novel approach to image segmentation that uses new quantum computing hardware. Segmentation is formulated as a graph cut problem that can be mapped to the quantum approximate optimization algorithm (QAOA). This algorithm can be implemented on current and near-term quantum computers. Encouraging results are presented on art...
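The graph-cut objective that QAOA approximately optimizes can be illustrated classically. The sketch below (not the paper's code) evaluates the Max-Cut cost function by brute force on a hypothetical 4-node graph whose edge weights stand in for pixel similarities; the weights and graph are made up for illustration only.

```python
import itertools

import numpy as np

# Hypothetical 4-node "pixel" graph: symmetric matrix of edge weights
# (illustrative numbers, not taken from the paper).
w = np.array([
    [0.0, 3.0, 1.0, 0.0],
    [3.0, 0.0, 0.0, 2.0],
    [1.0, 0.0, 0.0, 3.0],
    [0.0, 2.0, 3.0, 0.0],
])
n = w.shape[0]

def cut_value(z):
    """Total weight of edges crossing the cut, for spins z_i in {-1, +1}.

    The term (1 - z_i * z_j) / 2 is 1 when i and j are on opposite
    sides of the cut and 0 otherwise -- the same cost function QAOA
    encodes as an Ising Hamiltonian.
    """
    total = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            total += w[i, j] * (1 - z[i] * z[j]) / 2
    return total

# Brute force over all 2^n spin assignments; QAOA searches this same
# landscape with a parameterized quantum circuit instead.
best = max(itertools.product([-1, 1], repeat=n), key=cut_value)
print(best, cut_value(best))  # this graph is bipartite, so all edges can be cut
```

Brute force is only feasible for tiny graphs (2^n assignments); the point of mapping segmentation to QAOA is to search this exponentially large landscape on quantum hardware.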
Papers by Paul Klein