## Elasticity and Demand – March 18, 2015

Posted by tomflesher in Micro, Teaching.

The price-elasticity of demand measures how sensitive consumers are to changes in price. There are two primary formulas for it. Most commonly, introductory courses will use $\frac{\% \Delta Q^D}{\% \Delta P}$, where $\% \Delta$ means the percentage change. This means the numerator is $\frac{\Delta Q^D}{Q_0}$, where $Q_0$ is the original quantity demanded and $\Delta Q^D$ is the change in quantity demanded. The denominator is $\frac{\Delta P}{P_0}$. Through the properties of fractions, that ratio is equal to $\frac{\Delta Q^D}{\Delta P} \times \frac{P_0}{Q_0}$, and a lot of students find that formula much easier to use.
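For students who want to check the arithmetic, here is a minimal Python sketch of that formula; the numbers are hypothetical, not taken from the post:

```python
# Price-elasticity of demand from two observed points, using
# epsilon = (change in Q / change in P) * (P0 / Q0).
def price_elasticity(q0, q1, p0, p1):
    """Elasticity moving from the original point (p0, q0) to (p1, q1)."""
    dq = q1 - q0
    dp = p1 - p0
    return abs((dq / dp) * (p0 / q0))

# Hypothetical example: price rises from $10 to $11,
# quantity demanded falls from 100 to 95.
print(price_elasticity(q0=100, q1=95, p0=10, p1=11))  # 0.5 -> inelastic
```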

*A graph of demand and price-elasticity of demand*

Take note of the shape of that formula, and keep in mind the Law of Demand, which states that as price increases, quantity demanded decreases. At high prices, quantities are relatively low, meaning that a small percentage change in price yields a relatively big percentage change in quantity demanded. If the percentage change in quantity demanded is bigger than the percentage change in price, then demand is elastic and consumers are price-sensitive. On the other hand, at low prices, quantities are relatively high, meaning that a small percentage change in price yields an even smaller percentage change in quantity demanded. That means demand is inelastic.

This pattern of high prices corresponding to elastic demand and low prices corresponding to inelastic demand holds for most goods. At a very high price, firms can make a small change in price to try to encourage new buyers to buy their product, whereas at a very low price, firms can jiggle the price up a little bit to try to snap up some extra revenue without dissuading most of their buyers from purchasing the product.

A slightly more accurate formula for price-elasticity of demand is $\frac{dQ}{dP} \times \frac{P}{Q}$, which looks surprisingly like the previous formula but doesn’t depend on choosing an original value.

The graph in this post shows market demand $Q^D = 1000 - P$ in blue and elasticity in orange. Note the high elasticity at high prices and low elasticity at low prices.
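To reproduce the pattern the graph shows, here is a short Python sketch of the point elasticity along $Q^D = 1000 - P$, where $\frac{dQ}{dP} = -1$:

```python
# Point elasticity of demand for Q^D = 1000 - P.
# Since dQ/dP = -1, |epsilon| = P / Q = P / (1000 - P).
def point_elasticity(p):
    q = 1000 - p
    return p / q

for p in (100, 500, 900):
    print(p, round(point_elasticity(p), 2))
# 100 0.11   (inelastic at low prices)
# 500 1.0    (unit-elastic at the midpoint)
# 900 9.0    (elastic at high prices)
```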

## Elasticity (SPROING~!) – March 17, 2015

Posted by tomflesher in Micro.

When we think about elasticity in the real world, we often think about the properties of things like rubber bands or the waists of sweatpants. If a solid has high elasticity, that means it’s very sensitive to having forces applied to it – so while something like Silly Putty or latex is very elastic, other materials like steel or titanium are not. A small amount of force yields a lot of deformation for Silly Putty, but not much at all for steel.

Elasticity in economics works the same way. It measures how responsive one measurement is to a small change in some other measurement.

When economists say “elasticity” without any qualifiers, they typically mean the price-elasticity of demand, which measures how sensitive purchases are to small changes in price. Elasticity, $\epsilon$, is expressed as a ratio:

$\epsilon = \frac{\% \Delta Q^D}{\% \Delta P}$

where $\% \Delta Q^D$ refers to the percentage change in quantity demanded¹ (the actual change divided by the starting value) and $\% \Delta P$ refers to the percentage change in price (again, the actual change in price divided by the starting value).² This leads to three cases:

1. $\% \Delta Q^D > \% \Delta P$ – a small price change yields a big change in quantity demanded. This means that buyers of the good are price sensitive, and (equivalently) demand for the good is elastic. Note that in this case, $\epsilon > 1$.
2. $\% \Delta Q^D < \% \Delta P$ – a small price change yields an even smaller change in quantity demanded. This means that buyers of the good are not price sensitive, and demand for the good is inelastic. Here, $\epsilon < 1$.
3. $\% \Delta Q^D = \% \Delta P$ – a small price change yields exactly the same change in quantity demanded. The term for this type of demand is unit-elastic. When demand is unit-elastic, $\epsilon = 1$.
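Those three cases reduce to a one-line comparison. A tiny Python sketch (the function name is mine, not the post's):

```python
def demand_type(pct_dq, pct_dp):
    """Classify demand from percentage changes in quantity and price."""
    epsilon = abs(pct_dq / pct_dp)  # economists use the absolute value
    if epsilon > 1:
        return "elastic"
    if epsilon < 1:
        return "inelastic"
    return "unit-elastic"

print(demand_type(-10, 5))  # elastic: quantity moved more than price
print(demand_type(-2, 5))   # inelastic: quantity moved less than price
print(demand_type(-5, 5))   # unit-elastic: equal percentage changes
```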

It’s tempting to treat elasticity as very complicated, when it has a really simple mathematical interpretation. It answers the question “Which change is bigger – price, or quantity?”

Also interesting is the question of why some goods are demanded elastically and some are demanded inelastically. Typically, goods with many alternatives are demanded elastically. Alternatives can come in many forms. Most commonly, they’ll show up as substitute goods, or goods which you can use instead of another good. For example, bread has many substitutes (naan, grits, cornbread, rice, tortillas, English muffins….), and so if the price of bread rose significantly, you’d see many people substituting away from using bread. However, there are other forms of alternatives, too. You may see elastic demand for goods that cost a large proportion of the buyers’ income or that can be purchased over a longer timescale. A college education is an example of both of these – a small change in the level of tuition can lead to big changes in the behavior of students, who will often take a year off to earn money.

Anything with few alternatives will typically be demanded inelastically. Salt is the classic example, because it has no alternatives – it’s necessary for flavoring food, allowing our bodies to function properly, and (in the case of iodized salt) preventing certain illnesses. However, anything that is addictive (like tobacco or heroin), necessary for many uses (like cell phone plans), or difficult to switch away from (it’s not like you can put diesel fuel in your gas-engine car!) will typically have inelastic demand.

¹ Quantity demanded means the number of goods people are willing to buy at a certain price.
² Usually one of these will be negative and the other positive, because of the Law of Demand; economists, ever economical with their notation, simply ignore this and use the absolute value.

## The Do-Nothing Alternative – March 16, 2015

Posted by tomflesher in Examples, Micro, Teaching.

Consider the following situation: You are at a casino. You have a crisp new $100 bill in your pocket and an hour before your friend arrives. There are several options available: blackjack, poker, and slot machines. Each has its advantages and disadvantages. Blackjack offers a 45% probability that you will double your money over the next hour, but a 55% probability that you will lose it all. Based on your understanding of statistics, you know this means you should expect to have about $90 at the end of the hour. Poker is a better proposition – since it is a game of skill, you have a 60% chance of earning an extra $50 (for a total of $150), but a 40% chance of losing all of your money. That means you can expect to have about $90 in your pocket at the end of the hour. Slot machines, to go to the other extreme, are a highly negative expected-value proposition. You stand a 1% chance of winning $1000, but a 99% chance of losing all of your money. As a result, you could expect to have about $10 in your pocket at the end of the hour. Thinking like an economist, you quickly winnow your options down to blackjack or poker, since you cannot abide such a risky proposition. Then, however, you’re stuck – the expected values are the same. Which game is it rational to play?

Similarly, consider this problem raised in a freshman course on ethics: You are on your way out of a coffee shop carrying a double shot of espresso and a $1 bill you received as change. Two homeless people, one man and one woman, each step toward you and simultaneously ask you for the dollar. Since you don’t have any coins, you cannot split the value between the two people. Who should you give your dollar to?¹

What do these two situations have in common? In each of them, you are attempting to choose between two options that result in negative consequences for you. In the gambling scenario, you have two options, each with the expectation of losing $10. In the coffee shop scenario, you have two people each asking for $1. In neither case is there a compelling reason to choose one option over the other. The underlying assumption, though, is that we must choose an option at all.
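The expected values in the gambling scenario are easy to verify; here is a quick Python sketch using the probabilities and payoffs given above:

```python
# Expected ending bankroll for each option, starting from $100.
def expected_value(outcomes):
    """outcomes: list of (probability, ending bankroll) pairs."""
    return sum(p * v for p, v in outcomes)

options = {
    "blackjack":  [(0.45, 200), (0.55, 0)],
    "poker":      [(0.60, 150), (0.40, 0)],
    "slots":      [(0.01, 1000), (0.99, 0)],
    "do nothing": [(1.00, 100)],
}

for name, outcomes in options.items():
    print(name, expected_value(outcomes))
# Blackjack and poker both come out to 90.0, slots to 10.0;
# doing nothing keeps the full 100.0 with certainty.
```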

The do-nothing alternative is often (but not always) a hidden option when making choices. For example, in the gambling scenario, you have the option to literally do nothing for an hour until your friend arrives. This leaves you $100 with certainty. In the coffee shop scenario, you have the option to politely refuse each person’s request, leaving you free to keep your dollar. Not every situation allows a do-nothing option; for example, a baseball manager faced with the option of starting his worst starting pitcher or a pitcher who is usually used only in long relief cannot opt to simply start no pitcher. However, a voter who is disgusted with all available candidates may bemoan his “forced” vote for the lesser of two evils without acknowledging that he has the option simply not to vote at all. The do-nothing option is often low-cost but has low returns as well, making it a great way to avoid choosing the best of a bad lot, but a lousy choice for a firm seeking growth.

¹ This was met with considerable debate about the probability that the homeless woman had children.

## The Good and Bad of Goods and Bads – January 25, 2015

Posted by tomflesher in Micro, Teaching, Uncategorized.

When students first hear the word “goods” pertaining to economic goods, they sometimes find it a little funny. When they hear some sorts of goods called “bads,” they usually find it ridiculous. Let’s talk a little about what those words mean and how they pertain to preferences.

Goods are called that because, well, they’re good. Typically, a person who doesn’t have a good would, if given the choice, want it. Examples of goods might be cars, TVs, iPads, or colored chalk. Since people want this good if they don’t have it, they’d be willing to pay for it. Consequently, goods have positive prices. That doesn’t mean that everyone wants as much of any good as they could possibly have.
When purchasing, people consider the price of a good – that is, how much money they would have to spend to obtain that good. However, that’s not because money has any particular value. It’s because money can be exchanged for goods and services, but you can only spend money once, meaning that buying one good means giving up the chance to buy a different one.

Bads are called that because they’re not good. A bad is something you might be willing to pay someone to get rid of for you, like a ton of pollution, a load of trash, a punch in the face, or Taylor Swift. Because you would pay not to have the bad, bads can be modeled as goods with negative prices.

Typically, a demand curve slopes downward because of the negative relationship between price and quantity. This is true for goods – as price increases, people face an increasing opportunity cost to consume one more of a good. If goods are being given away for free, people will consume a lot of them, but as the price rises the tradeoff increases as well. Bads, on the other hand, act a bit differently. If free disposal of trash is an option, most people will not keep much trash at all in their apartments. However, as the cost of trash disposal (the “negative price”) rises, people will hold on to trash longer and longer to avoid paying the cost. Consider how often you’d take your trash to the curb if you had to pay $50 for every trip! You might also look to substitutes for disposal, like reusing glass bottles or newspaper in different ways, to lower the overall amount of trash you had to pay to dispose of.

As the cost to eliminate bads increases, people will suffer through a higher quantity, so as the price of disposal increases, the quantity accepted will also increase.

## Anecdote Alert: Do restaurant deposits depress attendance? – January 1, 2015

Posted by tomflesher in Examples.

Last night I spent New Year’s Eve at one of my favorite restaurants, Verace in Islip, New York. I actually did New Year’s Eve there last year, too, and there were three very interesting changes. The upshot is that the restaurant, though it had a fantastic menu, was significantly less full than it was last year, and the crowd skewed slightly older.

First, the price of the dinner was $65 last year and $85 this year. That corresponds to about a 30% price hike. That might deter some people, but I’m skeptical. The price-elasticity of demand for restaurant meals is about 2.3, or very elastic. (That means that if the price of a restaurant meal changes by 1%, the quantity of restaurant meals sold would drop about 2.3%.) If that’s the correct elasticity to apply here, that would explain a 69% drop in attendance, but I’m not so sure that restaurant meals on New Year’s are as elastic as restaurant meals during the rest of the year. The well-known Valentine’s Day Effect causes price elasticity for certain goods (cut roses) to drop on Valentine’s Day, and since a meal at home isn’t a close substitute for a restaurant meal on a special occasion, I’m skeptical that this price change would explain the precipitous drop in attendance.
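A back-of-the-envelope version of that calculation, using the exact prices rather than the rounded 30% (a sketch, with the 2.3 elasticity figure taken as given):

```python
elasticity = 2.3
price_change = (85 - 65) / 65          # fractional price hike
predicted_drop = elasticity * price_change

print(round(price_change * 100, 1))    # 30.8 (percent price increase)
print(round(predicted_drop * 100, 1))  # 70.8 (percent predicted attendance drop)
```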

Second, the restaurant required a deposit this year – $50 per person, returned at the beginning of the meal as a gift card. This was my first hypothesis, but I’m not sure it’s much of an explanation. For one thing, I put down my deposit on Monday, so there was no real loss of value. $50 per person to hold a spot is well within the means of most demographics that you see at Verace most nights [more on this in a moment], especially since it operated as a credit on the bill. No dice here, really.

Third, this might be the big one – Verace is part of the Bohlsen Restaurant Group, which operates a couple of restaurants at slightly different price points. This year, BRG made a big deal of advertising different experiences keyed to their different restaurants. Specifically, Teller’s was a much more expensive steakhouse offering, Verace was a meal only, but Monsoon – their lower-priced, Asian fusion restaurant – had a modular menu with options of $75 for a meal (a bit cheaper than Verace, but not much) and $75 for an open bar. The open bar and Monsoon’s dance floor almost surely made it more attractive to younger revelers. That also explains the shift in demographics – Verace’s younger crowd may have been cannibalized by another BRG restaurant.

In the alternative, our waiter’s hypothesis: The manager did a great job seating people. “This guy,” says he, “is a magician.” He may be, but I’m more interested in seeing Monsoon’s numbers.

## Uncertainty Over Time (Lords) – November 25, 2013

Posted by tomflesher in Micro.

The previous post introduced a problem that arose in the Doctor Who 50th Anniversary special in which uncertainty was a core element. To explore that problem in depth, we’ll need some tools to work with uncertainty. Also, that problem was a little grim, so let’s take the edge off.

Let’s say that a friend of mine, Matt, is visiting and I need to order food. Matt’s favorite food is fish custard, so it makes sense that I should order him a dish of fish custard. Simple enough. Of course, if my other friend Peter were coming over, and I knew Peter didn’t like fish custard and much preferred haggis, it wouldn’t make sense for me to order him fish custard. Obviously I’d order haggis for him. To make this work mathematically, let’s say that each friend of mine likes his preferred dish about the same, so we could say that Matt gets utility of $u_M(\text{fish custard}) = 1$ from eating fish custard and $u_M(\text{haggis}) = 0$ from eating haggis, and Peter gets the reverse – $u_P(\text{fish custard}) = 0$ and $u_P(\text{haggis}) = 1$. If I want to maximize my friend’s utility, then I should do so by buying each friend his favorite dish. But what if I don’t know who’s coming over?

Intuitively, it makes sense – if Matt’s more likely to come over, I should order fish custard. If Peter’s probably on the way, haggis should go on the menu. If they’re equally likely, flip a coin. More mathematically, if the probability that Matt is coming is $\pi_M$ and the probability that Peter is coming is $\pi_P$ (and $\pi_M + \pi_P = 1$), the expected utility of my guest when I order fish custard is

$E[u(\text{fish custard})] = \pi_M u_M(\text{fish custard}) + \pi_P u_P(\text{fish custard})$,

which reduces to

$\pi_M u_M(\text{fish custard}) + \pi_P (0)$.

Since $u_M(\text{fish custard}) = 1$, the expected utility of my guest from fish custard is just $\pi_M$. For haggis, of course, the same logic applies – Matt gets 0 utility, so we can ignore him, and the expected utility ends up being $\pi_P$. So, whichever friend of mine is more likely to show up should get his favorite dish ordered, and if I don’t know who’s coming, I might as well just draw names from a hat.

This gets a little bit more complicated if both of my friends like each dish. So, let’s consider what happens if $u_M(\text{fish custard}) = 1$ and $u_M(\text{haggis}) = 0$, but $u_P(\text{fish custard}) = 0.5$ and $u_P(\text{haggis}) = 1$. Then, our calculation for the expected $u(\text{haggis})$ stays the same (it’s still $\pi_P$), but now our $E[u(\text{fish custard})]$ ends up being

$\pi_M u_M(\text{fish custard}) + \pi_P u_P(\text{fish custard}) = \pi_M (1) + \pi_P (0.5)$

So now, in order to figure out which dish to order, we need to know what the probabilities are! If it’s fifty-fifty, then $E[u(\text{haggis})]$ is $0.5$, but $E[u(\text{fish custard})]$ is $0.5 + 0.5 \times 0.5 = 0.75$! In that case, we should order fish custard, even though the guys are equally likely to show up, since Peter likes fish custard a little bit, too. We don’t get to flip a coin (mathematically, we don’t reach our indifference point) until

$\pi_M u_M(\text{fish custard}) + \pi_P u_P(\text{fish custard}) = \pi_M u_M(\text{haggis}) + \pi_P u_P(\text{haggis})$

or

$\pi_M (1) + \pi_P (0.5) = \pi_M (0) + \pi_P (1) \Rightarrow \pi_M + \pi_P (0.5) = \pi_P \Rightarrow \pi_M = 0.5\,\pi_P$

That means we don’t get to the indifference point until Peter is twice as likely to arrive as Matt! Funny how these probabilities influence the choices we’ll make. Sometimes things get a bit more complicated, but that’s a post for another time.

… lord.
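The fish-custard arithmetic can be checked numerically. A short Python sketch using the utilities from the second example (Matt: 1 and 0; Peter: 0.5 and 1):

```python
def expected_utility(pi_m, u_matt, u_peter):
    """Expected utility of a dish when Matt arrives with probability pi_m."""
    return pi_m * u_matt + (1 - pi_m) * u_peter

for pi_m in (0.5, 1/3):
    fish = expected_utility(pi_m, 1.0, 0.5)
    haggis = expected_utility(pi_m, 0.0, 1.0)
    print(round(pi_m, 2), round(fish, 2), round(haggis, 2))
# At pi_m = 0.5, fish custard wins: 0.75 vs 0.5.
# At pi_m = 1/3 (Peter twice as likely), both dishes give 2/3: indifference.
```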

## SPOILERS: The Tenth and Eleventh Doctors’ Problem – November 24, 2013

Posted by tomflesher in Uncategorized.

Major, major spoilers here. If you haven’t yet seen the Doctor Who 50th Anniversary Special, please avoid this post.

During the 50th Anniversary Special episode of Doctor Who, there was an interesting scene where a group of characters and their malevolent clones were negotiating a treaty that would determine, in part, whether the Zygons (an empire of malevolent aliens) would take over the Earth. This presents some quite interesting problems that were solved by some quite interesting guys – the Tenth and Eleventh incarnations of the Doctor.

Obviously each side had opposite incentives. The Zygons needed a planet to live on, and the human population did as well. Take for granted that the populations can’t coexist and you can see that this is what’s known in economics as a zero-sum game. Since they would not agree to share the planet, one side would have to come out the winner in the negotiations, with the other losing. Similarly, since each Zygon cloned a human negotiator, it inherited at least some of its human’s memories, meaning that we can assume symmetric information. In cases like this, it’s usually up to the agent with the higher valuation of the asset being negotiated to make some sort of concession to the agent with the lower valuation in exchange for agreeing to give up any claims – in other words, pay them to go away. In other cases, the agents will agree to let an arbitrator make the decision for them, under the assumption that the arbitrator won’t be clouded by personal interests and will make the “best” choice, for some value of “best” to be determined.

The Doctors (played by David Tennant and Matt Smith) opted for an entirely different (and clever) solution, albeit one that wouldn’t work in the real world: they wiped the humans’ and Zygons’ memories so that each would forget which side they were on, allowing for efficient resolution of the problem. The theory, which is similar to John Rawls’s Veil of Ignorance, is that people who don’t know whether their side stands to benefit will reach an equitable solution, if not one that they might have argued for in the first place. As such, it’s an example of hidden information. The same thinking generates the idea in law and economics of efficient breach. This might result in the solution that’s efficient in the economic sense, but it probably won’t lead to anyone being the best off they could be.

## Shockingly, Economists Price Things Intelligently – May 27, 2013

Posted by tomflesher in Uncategorized.

I was excited to see that there’s a new edition of Recursive Macroeconomic Theory by Lars Ljungqvist and the Nobel Prize-winning Tom Sargent. I was even more excited to see prices starting to fall into place with respect to the Kindle edition.

Textbooks are a special product because demand for them has historically been inelastic: if you need the book, you need it. You won’t have any choice as far as whether to buy it. That’s led price-sensitive consumers to buy used textbooks, but there’s a limited supply. (Of course, even more price-sensitive consumers download the books illegally.) There’s also a pretty big market in old editions, even though there are usually slight differences between editions.

Recursive macro is a pretty standard subject, and although the third edition of Ljungqvist and Sargent includes new material, it probably won’t appreciably change the experience for a first-year grad student taking grad macro. There’s some benefit to the new edition, sure, but many students will be faced with the following choice: Pay $82.44 (as of today) for a new third edition, or pay $49.99 for a second edition. That’s a significant savings. However, second-edition sales aren’t beneficial to the publisher, so how can they create an incentive for price-sensitive buyers to give money to the publishers rather than to the secondary market?

Well, a Kindle version of the third edition is $51.29. Even price-sensitive students would probably find $1.30 a fair price to pay for an up-to-date edition, assuming they already had a Kindle-equipped device like an iPad. As used copies of the third edition accumulate, the prices of those will probably collect around the $50 mark as well, and if they don’t, I’d expect the price of the Kindle version to float along with that. Make it easy on price-sensitive consumers to give you money instead of giving it to the secondary market, and some of them will.

## Don’t Discount the Importance of Patience – April 9, 2013

Posted by tomflesher in Micro, Teaching.

Uncertainty is one explanation for why interest rates vary. Tolerance for uncertainty is called risk aversion, and it can be pretty complicated. (We’ll talk about it a little bit later on.) Another big concept is patience. Willingness to wait is also pretty complicated, but that’s our topic for today.

It’s easy to imagine some reasons that people would have different levels of patience. For one, you’d expect a healthy thirty-year-old (named Jim) to be more patient than a ninety-year-old (named Methuselah). What if someone (named Peter) offered us a choice between $100 today or a larger amount of money a year from now? How much would it take for Jim and Methuselah to take the delayed payoff? Would they take $100 a year from now? A lot can change in a year:

• There could be a whole bunch of inflation, and the $100 will be worth less next year than it is now. Boom, we’ve lost.
• We could put the money in the bank and earn a few basis points of interest. Boom, we’ve lost.
• We could die and not be able to pick up the money. Boom, we’ve lost.
• Peter could die and we wouldn’t be able to collect. Boom, we’ve lost.

Based on these, we’ll want a little bit more money next year than this year in order to be willing to take the money later instead of the money now. Statistically, though, Jim is more likely than Methuselah to be there to pick up the money.

Neither would take any less than $100 next year, but that’s just a lower limit. According to Bankrate.com, Discover Bank is paying 0.8% APY, which means that the $100 would be worth 0.8% more next year – just by putting the money in the bank, we can trade the risk that the bank goes bust (really unlikely) for the risk of Peter dying. That’s an improvement in risk and an improvement in payoff, so there’s no reason to take any less than $100.80. Again, though, this is a lower bound. Peter still has to pay for making them wait. That’s where the third point comes into play. Methuselah is probably not going to live another year. It’s much more likely that he’ll get to spend the $100 than whatever he gets in a year; in order to make it worth the wait, the payoff would have to be huge. Methuselah views money later as worth a lot less than money today. He might need $200 to make it worth the wait. Jim, on the other hand, might only need $125. He has more time, so he’s much more patient.
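Each man’s patience can be summarized by the ratio of the money today to the payoff he demands a year from now. A quick Python sketch:

```python
# Ratio of money today to the future payoff that makes each person
# indifferent between taking the money now and waiting a year.
def patience_ratio(today, future_payoff):
    return today / future_payoff

print(patience_ratio(100, 200))  # 0.5 (Methuselah)
print(patience_ratio(100, 125))  # 0.8 (Jim)
```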

This level of patience is called a discount factor and is usually written β. You can do this sort of experiment to figure out someone’s patience level. You’d then be able to set up an equation like this, where the benefit is the $100 and the cost is what you give up later:

$100 = \beta \times \text{Payoff}$

Methuselah, then, would have

$100 = \beta \times 200$

so β = 1/2. Jim would have the following equation:

$100 = \beta \times 125$

so β = 4/5. Based on this, we can say that Methuselah values money one year from now at 50% of its current value, but Jim values money one year from now at 80% of its current value. Everyone’s discount factor is going to be a little bit different, and different discount factors can lead people to make different choices. If Peter offers $100 today or $150 a year from now, Jim will wait patiently for the $150. Methuselah will jump at the $100 today. Both of them are rational even though their choices are different.

## Evaluating Different Market Structures – December 13, 2012

Posted by tomflesher in Micro, Teaching.

Market structures, like perfect competition, monopoly, and Cournot competition, have different implications for the consumer and the firm. Measuring the differences can be very informative, but first we have to understand how to do it.

Measuring the firm’s welfare is fairly simple. Most of the time, when we’re thinking about firms, what we’re thinking about is their profit. A business’s profit function is always of the form

Profit = Total Revenue – Total Costs

Total revenue is the total money a firm takes in. In a simple one-good market, this is just the number of goods sold (the quantity) times the amount charged for each good (the price). Marginal revenue represents how much extra money will be taken in for producing another unit.
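The profit identity can be sketched in a few lines of Python, using a fixed cost plus a constant per-unit cost (the numbers are hypothetical):

```python
# Profit = total revenue - total cost, where total cost is a fixed cost
# plus a constant marginal cost for every unit produced.
def profit(price, quantity, fixed_cost, marginal_cost):
    total_revenue = price * quantity
    total_cost = fixed_cost + marginal_cost * quantity
    return total_revenue - total_cost

# Sell 100 units at $10 each, with $200 of fixed costs and $3 per unit:
print(profit(price=10, quantity=100, fixed_cost=200, marginal_cost=3))  # 500
```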
Total costs need to take into account two pieces: the fixed cost, which represents things the firm cannot avoid paying in the short term (like rent and bills that are already due), and the variable cost, which is the cost of producing each unit. If a firm has a constant variable cost, then the cost of producing the third item is the same as the cost of producing the 1000th; in other words, constant variable costs imply a constant marginal cost as well. If marginal cost is falling, then there’s efficiency in producing more goods; if it’s rising, then each unit is more expensive than the last. The marginal cost is the derivative of the variable cost, but it can also be figured out by looking at the change in cost from one unit to the next.

Measuring the consumer’s welfare is a bit more difficult. We need to take all of the goods sold and measure how much more people were willing to pay than they actually did. To do that we’ll need a consumer demand function, which represents the marginal buyer’s willingness to pay (that is, what the price would have to be to get one more person to buy the good). Let’s say the market demand is governed by the function

$Q^D = 250 - 2P$

That is, at a price of $0, 250 people will line up to buy the good. At a price of $125, no one wants the good ($Q^D = 0$). In between, quantity demanded is positive. We’ll also need to know what price is actually charged. Let’s try it with a few different prices, but we’ll always use the following format¹:

Consumer Surplus $= \frac{1}{2} \times (p_{max} - p_{actual}) \times Q^D$

where $p_{max}$ is the price where 0 units would be sold and $Q^D$ is the quantity demanded at the actual price. In our example, that’s 125.

Let’s say that we set a price of $125. Then, no goods are demanded, and anything times 0 is 0.

What about $120? At that price, the quantity demanded is (250 – 240) or 10; the price difference is (125 – 120) or 5; half of 5 × 10 is 25, so that’s the consumer surplus. That means that the people who bought those 10 units were willing to pay $25 more, in total, than they actually had to pay.²

Finally, at a price of $50, 150 units are demanded; the total consumer surplus is (1/2)(75)(150) or 5625.
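Those calculations can be verified with a short Python sketch of the consumer-surplus triangle for $Q^D = 250 - 2P$:

```python
# Consumer surplus = (1/2) * (p_max - p_actual) * Q_D,
# with demand Q^D = 250 - 2P, so p_max = 125.
def consumer_surplus(p):
    q = max(0, 250 - 2 * p)
    return 0.5 * (125 - p) * q

print(consumer_surplus(125))  # 0.0
print(consumer_surplus(120))  # 25.0
print(consumer_surplus(50))   # 5625.0
```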

Whenever the number of firms goes up, the price decreases, and quantity increases. When quantity increases or when price decreases, all else equal, consumer surplus will go up; consequently, more firms in competition are better for the consumer.

Note:
¹ Does this remind you of the formula for the area of a triangle? Yes. Yes it does.
² If you add up each person’s willingness to pay and subtract 120 from each, you’ll underestimate this slightly. That’s because it ignores the slope between points, meaning that there’s a bit of in-between willingness to pay necessary to make the curve a bit smoother. Breaking this up into 100 buyers instead of 10 would lead to a closer approximation, and 1000 instead of 100 even closer. This is known mathematically as taking limits.