Of That

Brandt Redd on Education, Technology, Energy, and Trust

12 August 2010

Quote of the Day: Bil Lepp

When you can separate individuals from institutions you often find real people.
- Bil Lepp, from Punching the Lard

07 July 2010

Energy: The Future is Nuclear

In my first blog post on energy I calculated that worldwide energy production must increase to about 734 exajoules per year in order to raise the standard of living for most people to a reasonable level of comfort. This compares with current energy production of around 474 exajoules per year. It also assumes massive reductions in energy consumption in countries like the United States and Canada.
My second post on energy detailed the cost of energy from existing sources and the prospects of using each to meet the energy needs of the developing world. Notably, wind and solar are by far the most expensive sources of energy, and their environmental impact isn't as neutral as it has been portrayed.
The least expensive source of energy is nuclear — beating even hydroelectric power. But we need some changes to the nuclear economy based on technology improvements. We can't continue using the predominant form of nuclear fission without a long-term waste storage plan and safer reactor designs.
I'm following four innovative approaches to nuclear energy. Any one of these, if proven viable, promises to offer abundant, cheap and clean energy that can be sustained for millennia.
Fast-Neutron Nuclear Fission
Fast Neutron Breeder
Credit: TerraPower
Nearly all nuclear reactors presently used for energy production are thermal reactors which use slow-moving or "thermal" neutrons. The advantages of these reactors are that they can use low-grade nuclear fuel (moderately enriched uranium), they can use water as a coolant and it is difficult to misuse them to create nuclear weapons. Many design variations exist from those of questionable safety like Chernobyl to reliable designs that have operated for many decades. The trouble with thermal reactors is that they require enrichment of uranium ore and they produce nuclear waste that remains dangerously radioactive for thousands of years.
In contrast, fast-neutron reactors require more highly-enriched fuel (increasing the risk of weaponization) and more exotic coolants like liquid sodium. However, they have three big advantages. First, the waste from a fast-neutron reactor has a much shorter half-life and requires storage for only a few hundred years. Second, the fast neutrons can breed new fuel from otherwise-unusable uranium, producing more fuel than the reactor consumes. Third, the fast neutrons can be used to reprocess the nasty waste from thermal reactors, resulting in a mix of new nuclear fuel and short half-life waste.
Many research groups are pursuing variations on the fast-neutron design that capitalize on these advantages while managing the problems of weaponization and exotic coolants. One approach is to have most reactors of the thermal design while a few fast-neutron reactors reprocess and produce fuel for the rest. However, such a nuclear economy requires a lot of transportation and processing of radioactive materials.
A traveling wave reactor is a variation on the fast-neutron design that is pre-loaded with a small amount of enriched fuel to get it going and filled the rest of the way with unenriched feedstock. The reaction starts in the enriched section with the fast neutrons enriching the neighboring fuel. The "wave" of the reaction moves from the pre-enriched section through the newly-enriched area until all fuel has been consumed.
TerraPower is working on a traveling wave design that could be pre-loaded with enough fuel to last 60 to 100 years. A small amount of enriched fuel (the dangerous stuff) would be loaded with a large quantity of depleted uranium (plentiful and safe to transport) and the whole system buried. When the reactor eventually "burns out" the short half-life waste might be left buried in place while a new reactor takes over.
Polywell Fusion
Polywell
Credit: EMC2 Fusion
Nuclear fusion has long been the holy grail of energy production. It's the primary reaction fueling our sun and the stars. For fusion you take two hydrogen atoms and fuse them using high temperature and pressure to create helium and a lot of energy. The advantages of fusion over fission are that the fuel is plentiful -- hydrogen extracted from seawater being one option -- and the waste is inert helium. It should be noted, however, that most fusion reactions release radiation so the reactor itself must still be shielded.
The trouble is that maintaining a controlled reaction has proven to be very difficult. IEC Fusion is one of several approaches that is gaining attention over the more conventional and extremely expensive tokamak.
Inertial Electrostatic Confinement Fusion was invented by Philo T. Farnsworth who also invented television. Farnsworth's idea was to place a grid in a vacuum chamber with a strong negative charge. When hydrogen ions are released into the chamber they are accelerated toward the grid and some percentage of them collide in the center with sufficient energy to fuse into helium.
The Fusor, as Farnsworth called his device, has been proven to generate fusion. Building one is relatively simple and inexpensive. Many hobbyists have built their own. However, current designs consume considerably more energy than they produce. The main energy leak is that many of the ions collide with the grid itself consuming some of the charge and contaminating the plasma with the products of the (non-nuclear) grid collision.
For a little more than a decade, Dr. Robert Bussard quietly researched ways to overcome problems with the fusor. His device, called the Polywell, makes the grid out of coils. An electrical current in the coils creates a magnetic field that guides the ions around the grid and prevents collisions. He and his team made several important breakthroughs shortly before their U.S. Navy funding ran out. At that point he gave a famous talk at Google in which he detailed the progress they had made and sought funding to continue the research. Unfortunately, Dr. Bussard died of natural causes before funding was renewed. Thankfully, Dr. Richard Nebel has obtained funding and continued the work. So far the results are promising and he expects to have proven whether the concept is viable within two years.
Focus Fusion
Dense Plasma Focus
Credit: Lawrenceville Plasma Physics
The Dense Plasma Focus device creates a toroidal plasma by discharging a high voltage arc in a near-vacuum. The electrical and magnetic fields in the plasma torus cause it to collapse into a very dense, hot formation called a plasmoid. Under the right conditions, the plasmoid is dense and hot enough to create nuclear fusion. The fusion reaction releases heat, x-rays and high-velocity ions. The trick is to capture all three of these products in such a way as to generate electricity.
Lawrenceville Plasma Physics claims that they have a reactor design that will effectively capture sufficient energy to be a viable clean source of nuclear energy.
Most fusion research focuses on the Deuterium-Tritium reaction (Deuterium and Tritium are both isotopes of Hydrogen). That's because it's the easiest fusion reaction to achieve; it requires the least energy. However, both IEC Fusion and Focus Fusion have the potential to work with other reactions because increasing the heat is mostly a matter of raising the voltage. This raises the possibility of using a hydrogen-boron reaction. The advantage is that when hydrogen and boron fuse they release three helium atoms and a bunch of energy but no neutron radiation. Thus, a hydrogen-boron reactor wouldn't require heavy shielding.
Steam Fusion
General Fusion
Credit: General Fusion
General Fusion proposes to inject a small amount of deuterium-tritium mixture at the center of a sphere of liquid metal. The outside of the sphere is simultaneously struck by hundreds of rams which create a spherical shock wave. When the wave reaches the center it compresses and heats the D-T mix sufficiently to generate fusion.
When I first read about the idea, the researchers proposed to use mercury for the liquid metal and steam to drive the pistons. Hence the moniker, "Steam Fusion." The current General Fusion design uses pneumatic pistons and a hot lead-lithium mix for the metal. They call it MTF Fusion but I think Steam Fusion is more catchy.
If proven viable, any of these approaches might deliver plentiful, clean, safe and inexpensive energy. And that combination could bring about societal changes rivaling the Industrial Revolution. All four appear viable on paper which means that there is a very good chance that one or two can be proven viable. The Nuclear Age has been long in coming but I think it's almost here.
Other posts in this series:
Scotty, We Need More Power!
Increasing Energy Production

03 June 2010

Quote of the Day: Juliet B. Schor

"We've got the worst of Capitalism and Socialism. We have private gains and socialized losses."
- Juliet B. Schor, interview on the Diane Rehm Show at approx. 19:45

14 May 2010

Increasing Energy Production

In my last post in this energy series I determined that in order to alleviate world poverty we need to increase worldwide annual energy production from 474 exajoules (total production in 2008) to 734 exajoules (108 gigajoules per person per year). That's a difference of 260 exajoules.
This number is probably low because it assumes a 50% reduction in energy consumption in the U.S. and Canada and it is based on today's population. The U.S. may not be able to achieve such efficiencies and worldwide population is certainly going to increase. For the sake of the following calculations I chose a target of producing an additional 350 exajoules.
The U.S. Energy Information Administration offers the following breakdown of worldwide energy production in 2006 (the latest year for which they've published data).
I've converted from BTUs to Joules. The total comes to 495 exajoules which is a little higher than the 474 cited in my previous post but the numbers are close enough to work with.
So, here are the prospects for generating an additional 350 exajoules from various sources:
Petroleum and Natural Gas
One gigajoule from oil costs $13.56.
One gigajoule from gas costs $4.74.
The Peak Oil theory dates back to 1956. It suggests that there will come a day when the remaining oil reserves are too expensive to extract and worldwide petroleum production will be forced to decline. Current projections are that peak oil will be reached on or before 2020. But the peak oil year has been moved back several times and there is good evidence that it's still a long way off.
Regardless of whether oil and gas reserves are nearing exhaustion, there are other problems with petroleum. Foremost is pollution. I'll defer debate about carbon dioxide as a pollutant to other authors. There still remain other pollutants including sulfur oxides, nitrogen oxides, carbon monoxide and so forth. Natural gas burns more cleanly than crude oil products but it still generates pollutants. New automotive technology has reduced oil emissions to a fraction of their former levels. But these gains have been achieved in industrialized countries where regulations have encouraged such developments. In the developing world, emissions are much worse though the extent isn't accurately measured.
The other problem with petroleum and natural gas is opportunity cost. Presently we have no good alternative energy source for transportation. If we consume petroleum to generate electricity and heat, the cost of transportation will be driven up.
Due to the transportation link, petroleum use will be around for a long time. But, it's hard to consider massive increases in oil and gas consumption as a sustainable solution for meeting poverty's energy needs.
Coal
One gigajoule from coal costs $3.24.
Among fossil fuels, coal is the low-price leader. For this reason, coal supplies 49% of electricity in the United States, 69% in China and 40% worldwide. In the United States it is estimated that enough coal is recoverable to last 146 years at current growth rates.
So, there is enough coal to last for quite a while and it is inexpensive. But as with oil and gas, pollution is a problem. For those concerned about carbon dioxide emissions, coal releases about 35% more CO2 than gas or oil for the same amount of energy. More concerning to me are emissions of soot and sulfur dioxides. In the United States, scrubbers are used to keep emissions relatively clean. However, most Chinese plants lack such scrubbers and China burns more coal than the United States, the European Union and Japan combined.
Hydroelectric
One gigajoule from hydropower costs $2.36.
Hydroelectric power is a nearly perfect solution. It's renewable, non-polluting, extremely efficient and it can be stored (in the form of reservoirs) until needed. However, though they don't pollute, dams and reservoirs have considerable environmental impact. In the United States, we've already harnessed just about all the hydropower available. More might be available in the developing world but not enough to deliver the needed 350 exajoules.
Wind
One gigajoule from a wind farm costs $13.89.
With new technology, the cost of wind power has dropped by more than 80% in the last two decades. Despite that improvement, it's the second most expensive source of energy on this list. The amount of wind energy that can be generated per acre varies tremendously with the amount and consistency of wind in that area. A representative example is the mega windfarm proposed by T. Boone Pickens which would produce 4,000 megawatts from 200,000 acres. That works out to approximately 630 gigajoules per acre per year (assuming that the 4,000 megawatts is average production). Unfortunately I suspect that 4,000 megawatts is peak production during ideal wind conditions. Giving the benefit of the doubt and assuming 4,000 megawatts is average, this kind of wind energy density can supply the total energy needs of six people per acre.
To be clear, I'm using the total energy need per person, not just electricity. It includes lighting, heat, cooling, transportation and the energy required to manufacture and produce all goods used by that individual.
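As a sanity check on the arithmetic, the wind figures above can be re-derived in a few lines. The inputs are the numbers quoted above (4,000 megawatts over 200,000 acres, optimistically taken as average output, and the 108 gigajoule per-person target from the earlier post):

```python
# Rough check of the wind energy density figures from the Pickens plan.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

watts = 4_000e6            # 4,000 MW, optimistically assumed to be average output
acres = 200_000
joules_per_year = watts * SECONDS_PER_YEAR      # total annual output in joules
gj_per_acre = joules_per_year / acres / 1e9     # gigajoules per acre per year

need_per_person_gj = 108   # total energy need per person per year
people_per_acre = gj_per_acre / need_per_person_gj

print(f"{gj_per_acre:.0f} GJ per acre per year")   # about 631
print(f"{people_per_acre:.1f} people per acre")    # about 5.8
```

Which confirms the roughly 630 gigajoules per acre and six people per acre cited above.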
While wind will make important contributions to overall energy production, the cost of production and low energy density per acre prevent it from being more than a small contributor to the overall solution.
Solar
One gigajoule from solar-voltaic panels costs $83.33.
Direct Insolation is the amount of solar energy delivered per square meter per day. In my city of Provo, Utah it averages 4.64 kWh/m^2*day. That works out to 6.1 gigajoules per square meter per year. The best solar cells ever tested achieve 41.6% efficiency in the laboratory. However, using solar cells of practical cost without tracking or concentration systems, the best efficiency to be expected is about 5%. Assuming these parameters, it would take 354 square meters of solar panels to supply the energy needs of one person. The roof of a typical suburban home is about 150 square meters.
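The solar arithmetic can be re-derived the same way (inputs: 4.64 kWh/m²/day insolation, 5% practical efficiency, and the 108 GJ per-person target):

```python
# Re-deriving the solar area calculation above.
KWH_TO_J = 3.6e6                     # one kilowatt-hour in joules

insolation_kwh_day = 4.64            # average direct insolation in Provo, UT
gj_per_m2_year = insolation_kwh_day * 365 * KWH_TO_J / 1e9   # about 6.1 GJ/m^2/yr

efficiency = 0.05                    # practical panel efficiency without tracking
usable_gj_per_m2 = gj_per_m2_year * efficiency

need_per_person_gj = 108
area_m2 = need_per_person_gj / usable_gj_per_m2

print(f"{gj_per_m2_year:.1f} GJ/m^2/yr incident; {area_m2:.0f} m^2 per person")
```

This reproduces the 6.1 gigajoules per square meter and roughly 354 square meters per person.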
This shows that the energy density of solar power starts to approach practicality. Presently, the big barrier is the cost of manufacture. At today's prices, a 354 square meter solar array would cost approximately $1.4 million. This explains why solar power is far and away the most expensive source.
Solar-voltaic technology is advancing rapidly. The cost of manufacture is dropping and the efficiency is climbing. There are other solar technologies such as passive solar heating, solar water heating and solar concentrators which may cost less than solar-voltaic systems. There are also problems. Solar power is not consistent which means energy storage or alternative sources are needed for night and cloudy days.
Solar power--particularly solar-voltaic panels--is ideally suited to urban rooftops. Not only do panels deliver peak power during peak electrical demand (for air conditioning) but by converting light into electricity they reduce heat uptake on the roof thereby reducing the air conditioning load in the summer. However, for this to be practical, cost of manufacture would have to be reduced to about 5% of current costs. That's a tall order.
Nuclear
One gigajoule from nuclear power costs $1.42.
Nuclear power is the least expensive energy source available. The amount of energy that can be produced is tremendous, there is enough fuel to last millennia and the environmental impact is the least of all the energy sources cited.
The problem with nuclear power is that while the actual environmental impact is very small the perceived environmental impact is large and, in an accident like Chernobyl, the potential impact is tremendous. Clearly the perception of environmental impact is due to the potential for disaster. This has resulted in a political atmosphere that has severely limited the construction of new power plants since the Three Mile Island incident.
Another problem with nuclear power is the prospect of repurposing a power plant or its waste products into nuclear weapons.
New technologies for nuclear power are emerging that address the potential for disaster as well as limiting the prospects for repurposing. Those will be the subject of my next post on energy.
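For a side-by-side view, here are the per-gigajoule costs quoted throughout this post, sorted from cheapest to most expensive (a quick sketch; these are the point-in-time prices cited above and they fluctuate with the market):

```python
# Per-gigajoule costs as quoted in this post, sorted for comparison.
cost_per_gj = {
    "nuclear":     1.42,
    "hydro":       2.36,
    "coal":        3.24,
    "natural gas": 4.74,
    "oil":         13.56,
    "wind":        13.89,
    "solar":       83.33,
}

for source, cost in sorted(cost_per_gj.items(), key=lambda kv: kv[1]):
    ratio = cost / cost_per_gj["nuclear"]
    print(f"{source:12s} ${cost:6.2f}/GJ  ({ratio:.0f}x nuclear)")
```

The spread is striking: solar-voltaic power costs nearly 60 times as much per gigajoule as nuclear.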
Energy prices were obtained from the following sites. Each was converted into dollars per gigajoule. As prices fluctuate with the market, the numbers you see as you follow the links may not be the same ones I used. Likewise, prices for some sources like coal can vary by as much as 60% depending on region. In each case, I selected prices that seemed to represent the majority of the market.
Energy Content of Fuels
Spot Price of Oil
Spot Price of Natural Gas
Spot Price of Coal
Hydroelectric Energy Cost
Wind Energy Cost
Solar Energy Cost
Nuclear Energy Cost (In Europe)

Other posts in this series:
Scotty, We Need More Power!
Energy: The Future is Nuclear

12 May 2010

Quote of the Day: Anna Stone

"Improvisation, if you don't have something worthwhile to say, is just hot air."
(Anna Stone, English Teacher Extraordinaire speaking on the importance of knowing the subject and the historic context before relying on oratory skills.)

30 April 2010

Provo: Awesome or Boring?

The April 24 edition of Forbes listed my city of Provo, UT as the second best city in the United States for Business and Careers (following Des Moines, IA). Meanwhile, Portfolio.com rated Provo the least fun among the 100 largest cities in the U.S.

If you believe the rankings, "good for business and careers" and "fun" seem to be contrary pressures. For example, New York City ranks #1 on the Portfolio "fun" list and #99 on the Forbes "business" list -- nearly an exact reversal of Provo's ranking (#2 for business and #100 for fun). However, I think that the Portfolio ranking is badly flawed. Their categories for ranking are Shopping, Gambling, Popular Entertainment, Culture, Food and Drink, Low-Impact Sports and High-Impact Sports.

There's little question that Provo's not a very good gambling destination (#92 on their list) and I also have a hard time disputing New York's #1 rank for shopping and culture. But how do they get away with ranking Provo as #98 for high-impact sports (represented by an icon of a skier) vs. New York's #2 ranking? I can be on the slopes at Sundance 20 minutes from leaving my front door. And best-in-the-world resorts like Alta, Snowbird, Brighton, Solitude, Deer Valley and Park City are all within an hour's drive. Where do New Yorkers go to ski, much less hike, mountain bike, camp, drive off-road, fish and so forth?

Oh well, I like New York too. And despite its #99 business ranking I think a brokerage would be better off locating in New York than in Provo. There's a lot more subjective influence than these rankings would suggest.

26 April 2010

Toxic Assets Revisited

I just attended a fascinating lecture by Dr. Hal Heaton who was my MBA Business Finance teacher 16 years ago. He outlined what he called the "Perfect Storm" of events that led to our current financial crisis. Much of what he had to say is summarized in this business case though his live presentation included some nice graphs illustrating many of the financial trends.

Following the lecture I asked him about my theory that the markets could self-correct the problem of toxic assets (outlined in my previous blog post on this subject). It turns out that he has been serving as an expert witness in several lawsuits related to the meltdown and has direct experience in this area. He assured me that, indeed, the derivatives market has mostly shut down and that the remaining derivative instruments are treated as the risky instruments they really are.

According to Dr. Heaton, one lingering problem is that the Community Reinvestment Act that I talked about in my history of the banking crisis remains in place along with enhancements that were passed in 1999 and 2005. Presently the provisions aren't being enforced but if they are, banks will be required to continue to issue the kind of high-risk loans that helped create this problem in the first place.

From his primary lecture I learned that Dr. Heaton views the sub-prime lending and the associated financial derivatives as only two components in a six-part "perfect storm." Here's the full list.

  • High-risk mortgages spurred on by the Community Reinvestment Act and its more recent kickers. (The requirements remain in place though they aren't currently being enforced.)
  • Enormous increase in the money supply with interest rates reduced to nearly zero. (Rates are still there.)
  • Hybrid mortgages that had a two-year low introductory rate. Homeowners expected to be able to refinance after two years because "home prices always go up" as they had done almost continuously for the 40 years preceding 2008. (Many of these have already been foreclosed upon but there remain several waves of ARMs yet to create problems.)
  • Asset Securitization -- the financial derivatives used to finance high-risk mortgages and an enormous variety of other investments. (Mostly out of favor.)
  • The transfer of manufacturing to China and other emerging markets. This results in an enormous trade deficit. Under normal circumstances, such a deficit would strengthen the yuan and weaken the dollar thereby bringing things into balance. But the Chinese government, not wanting to slow the growth, purchases dollars from manufacturers in exchange for yuan and then invests those dollars in US Treasuries. (The recession has reduced the trade deficit by half but it remains tremendously high by historic standards.)
  • The complicity of Moody's and Standard and Poor's in giving excessively high ratings to mortgage-derived securities based on the incorrect assumption that housing prices would not decline. (This has been corrected.)
Possibly even more concerning is Dr. Heaton's assertion that many of the "rules of economics" he taught me those years ago have been violated in ways he would never have foreseen. Examples: The money supply has been tripled but interest rates and inflation remain extremely low. We've been able to sustain an enormous trade deficit without currency corrections. The Fed has been purchasing treasuries and yet the sky hasn't fallen.

We are in unprecedented territory. What happens next is anybody's guess.

21 April 2010

Quote of the Day: Plastic

"Humanity's plastic footprint is probably more dangerous than its carbon footprint." -- Charles Moore, Ocean Researcher
Source

20 April 2010

Toxic Assets

With health care seemingly out of the way, congress is turning its attention to finance reform. Last fall I posted my summary of the crash of 2008. I think there's little doubt that reforms need to be made in the financial markets. However, I've been wondering if those reforms need to be brought about through legislation or if there might be another way.

Most of the blame for the financial crisis has been leveled at investment banks and other institutions that hid risky investments behind complicated financial instruments. However, Credit Rating Agencies like Moody's and Standard and Poor's were complicit in creating the problem because, like the banks, they ignored the possibility of market-wide problems.

So, just as the cooperation of Credit Rating Agencies helped create the problem, CRAs could likewise drive much of the reform. Unfortunately, there continue to be allegations of inflated ratings and the agencies have avoided liability for past mistakes. Despite this I have hope that reform can come from this sector without legislative pressure.

The term "Toxic Asset" was invented in 2008 to describe financial derivatives for which a value cannot be determined with confidence. The presence of large quantities of toxic assets on corporate and bank balance sheets froze the financial markets. There are markets for both high- and low-risk securities; it's when risk or value cannot be determined with confidence that markets freeze up. "Toxic Asset" is a very descriptive term for such things.

What I would like to see is an agency that would report on the portion of a security -- stock, bond, or derivative -- that is composed of questionable derivatives. To do so with accuracy would require cascading fractions through the network of ownership. For example, if 15% of a bank's balance sheet is composed of toxic assets and 20% of a mutual fund is invested in that bank then the mutual fund would be rated 3% toxic (15% * 20% = 3%). Of course, some portion of other stocks in the mutual fund might also be considered toxic so the total toxicity of the mutual fund might be higher.
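Here's a minimal sketch of that cascading calculation. The asset names and fractions are hypothetical, and the sketch assumes an acyclic ownership structure (real ownership networks contain cycles, which a production system would need to resolve, e.g., by iterating to a fixed point):

```python
# Cascading toxicity through a (hypothetical) network of ownership.
# Each asset has a directly-toxic fraction of its own balance sheet
# plus fractional holdings of other assets.
holdings = {
    # asset: (direct toxic fraction, {held asset: fraction of portfolio})
    "bank_a":      (0.15, {}),                   # 15% of balance sheet is toxic
    "mutual_fund": (0.00, {"bank_a": 0.20}),     # 20% invested in bank_a
}

def toxicity(asset: str) -> float:
    """Total toxic fraction: direct toxicity plus holdings, cascaded."""
    direct, held = holdings[asset]
    return direct + sum(frac * toxicity(h) for h, frac in held.items())

print(f"{toxicity('mutual_fund'):.0%}")  # 20% of a 15%-toxic bank -> prints "3%"
```

With more assets in the dictionary, the fund's total toxicity would be the sum over all its holdings, exactly as described above.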

Creating a database that tracks the network of ownership would be complicated but not impossible. The information required is all in the public record. There would have to be an objective way of determining whether a fundamental asset is toxic. However, once the system is in place, it could also be used to rate cascading ownership in many other types of assets. Fractional ownership in business sectors such as manufacturing, education or hospitality could be measured through the cascading layers. Involvement in totalitarian regimes, conflict assets or vice business could also be tracked.

I suppose this is another of my Business Concepts. It would take a considerable up-front investment and a continuing investment to maintain the database but the ability to analyze cascading ownership would be a potent investment tool.

16 April 2010

Scotty, We Need More Power!

Energy production and standard of living are directly connected. My neighbor, Dr. L. Douglas Smoot has a presentation (as yet unpublished) that he's made to various audiences in the last year. His thesis is that in order to raise the standard of living for impoverished nations we have to raise the corresponding energy production. That's because energy is required for the production of food, for manufacture of goods, for the treatment of illness, for the management of indoor temperature and for the transportation of everything.


The graph above, extracted from this excellent Department of Energy study, shows the correlation between the Human Development Index (a measure of standard of living) and per-capita energy use. It's arguable that the energy consumption of U.S. citizens could be reduced while still maintaining a comfortable lifestyle. Nevertheless, per-capita energy production of developing countries would have to be increased to somewhere around U.K. levels if poverty and disease are to be reduced to levels seen in industrialized countries.

The DOE study indicates the threshold is about 4,000 kWh per person which is somewhere between Spain and South Korea on the graph. Of course, electricity is only a fraction of total energy use. The same DOE study indicates a ratio of total energy to electricity use of 7.5 should be used for standard-of-living calculations. Therefore we need approximately 30,000 kWh or 108 gigajoules per person.
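The conversion works out as follows (a quick check; one kilowatt-hour is 3.6 megajoules):

```python
# Per-person energy target: 4,000 kWh of electricity scaled by the
# DOE ratio of 7.5 total energy to electricity.
KWH_TO_J = 3.6e6                       # joules per kilowatt-hour

electricity_kwh = 4_000                # DOE standard-of-living threshold
total_kwh = electricity_kwh * 7.5      # 30,000 kWh of total energy
total_gj = total_kwh * KWH_TO_J / 1e9  # 108 gigajoules

print(f"{total_kwh:,.0f} kWh = {total_gj:.0f} GJ per person per year")
```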

There is a lot to be gained through improving energy efficiency. Insulation, hybrid cars, smaller vehicles, public transit, high-density housing and so forth are all important pieces of the solution. However, the initial figures I have used here are less than half of U.S. energy consumption. So, efficiency gains are more likely to rein in high-consumption populations like the U.S., Canada and Japan than they are to reduce the needs of developing countries. Besides, it costs energy to manufacture all of these efficiency improvements.

The current world population is estimated at 6.8 Billion. So, in order to eliminate poverty, increase freedom and improve the human condition we need approximately 734 exajoules (734 * 10^18 J) of net energy production per year in addition to massive improvements in energy efficiency. In 2008, worldwide net energy production was 474 exajoules. Given that population continues to grow, we should be seeking to more than double worldwide energy production while still seeking to improve energy efficiency.
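Scaling the per-person figure up to the current population confirms the totals (a quick check using the numbers above):

```python
# World energy requirement: 108 GJ per person across 6.8 billion people.
population = 6.8e9
per_person_gj = 108

world_ej = population * per_person_gj / 1e9   # gigajoules -> exajoules
current_ej = 474                              # 2008 worldwide net production

print(f"need about {world_ej:.0f} EJ/yr vs {current_ej} EJ/yr today "
      f"({world_ej / current_ej:.2f}x current production)")
```

That's roughly 1.5 times today's production before accounting for population growth, which is why the goal is to more than double output.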

If energy production is increased at the cost of environmental damage, we'll miss the goal. Clean air, clear water and wild spaces are just as important to quality of life as good health, nutritious food and a comfortable home. In future posts I'll look into where that energy might come from.

Other posts in this series:
Increasing Energy Production
Energy: The Future is Nuclear

14 April 2010

OfThat is Back!

After being down for about 45 days (and a posting hiatus before then) "Of That" is back -- this time hosted by Google's Blogger.

What Happened to Azure?

I originally started hosting on Microsoft's Azure. During the technology preview stage, hosting was free. I was also evaluating it as a platform for future products and based on the early information I could find, it appeared that the cost to host on Azure following its full release would be modest. However, once the product released I found that the cost was going to be prohibitive.

I'll write more about Azure in a future post. For now, it's sufficient to say that it should grow to be a good platform for enterprise apps and possibly hosted services but it's not appropriate for small-scale things like my blog. There are things they could do to fix that weakness but I don't know if it's a priority for Microsoft.

What About BlogEngine?

I chose BlogEngine because it's a solid solution written in C# on ASP.Net -- a platform I'm familiar with. I wanted the ability to customize more than just the appearance of the blog and I have plans to launch active widgets as tools and experiments. However, it took me hours of programming to adapt BlogEngine to Azure and there was a lot more that I wanted to do. This all took away from any time spent on the widgets and experiments themselves. My new strategy is to let Google/Blogger worry about the blogging side. I'm confident that their available customizations and APIs will be sufficient to let me integrate my stuff.

Why Blogger?

Why did I choose Blogger instead of WordPress or TypePad or <insert your favorite here>? First, because hosting is free even when using a custom domain name. Second, because Google allows monetization by placing ads on my site if I ever choose to do so (not yet). Third, because it's simple and straightforward. Sure, it doesn't have all of the features that some other platforms carry but it has all of the features that I need. By avoiding unnecessary bells and whistles I gain ease of use.

Shortly I'll write about my experience in transferring my existing posts from BlogEngine to here.

18 November 2009

Comments!?

In recent days the first comments have started to appear on this blog. It's taking a while to generate an audience so I was excited to see this. However, since the BlogEngine software I use allows commenters to include a link back to their own website I soon realized that most of the comments are generic compliments with no factual contribution. Most are there as an excuse to post links back to other sites.

In short, I've become an inadvertent participant in Link Farming. So far, I haven't posted a policy about acceptable comments. Despite being "used" to some extent, I'm still new enough to this to be flattered by any comments on my site even if they are motivated by other purposes. Therefore, I don't plan to delete them... at least for now.

However, in anticipation of future problems, I suppose I should have some sort of acceptable comment policy. For now it's this: My policy is capricious and arbitrary. In other words, I reserve the right to delete any comment without explanation. In practice, however, the sorts of things I won't tolerate are foul or abusive language. The sorts of comments I prefer are factual contributions. I also appreciate opinion contributions so long as the reasoning behind your opinions is explained.

02 November 2009

Hacking the Vote

It's Election Day -- albeit an off-year election. In Provo we are electing a new mayor and several members of the city council. I've heard and made the argument that local elections like this are actually more important because local officials have a greater effect on our personal lives than those in faraway Washington. Unfortunately, I think that's no longer the case.

But I digress.

The subject of this blog entry is election technology. Like many municipalities, we have changed to a computerized "Direct Entry" voting system in which the voter enters his or her votes into a touch-screen device. Despite being a technophile, I have serious misgivings about these systems.

To be sure, electronic voting makes tallying the vote quick and easy. My concern is that no matter how secure you make these systems, it remains possible that the vote could be manipulated without leaving any evidence. The computer scientists on Freedom to Tinker have been involved in several reviews of voting system software. They've found numerous security flaws as well as cases of accidental under- and over-voting. Even if the flaws were fixed, the systems would be insecure without proper security procedures.

Recently, Sequoia Voting Systems announced that they will be publishing the source code to their voting systems. This is a very important step as it will allow independent reviews to detect and correct security flaws. However, while this makes it more difficult to manipulate the vote, it can't prevent it entirely. Even with perfect software, there are other ways to manipulate the data as it passes through the system. And published source doesn't make manipulation any more detectable if it does happen.

Another serious problem is attempting to determine voter intent, especially in the case of a recount. Despite hanging chads and other obstacles, at least the recounters in the 2000 U.S. Presidential Election had physical evidence of voters' actions. With direct-entry systems, little physical evidence is preserved. The better systems, like those used in Utah, include a paper tape printer. The voter is expected to verify his or her vote on the printed tape before it is finalized, but I suspect that many voters don't bother or don't understand the importance of verification. And for paper verification to work, there need to be random tests where human counters check to make sure the tapes match the recorded vote.

More serious, in my opinion, is that computer security issues aren't intuitive to those not trained in the subject. Despite good intentions (and some training), volunteer election judges can be oblivious to serious security issues simply because they don't know what to look for.

For these reasons, I favor Optical Scan balloting systems. These systems use paper ballots that are marked by hand. For efficiency, they are rapidly counted by an optical scanner. To be sure, optical scan systems remain vulnerable to ballot stuffing, voter intimidation and other attempts to manipulate the vote. But these known problems are intuitive to election judges and there are good procedures that can be used to mitigate them. When using optical scan ballots, paper-only security systems can be augmented by electronic measures like digitally signed ballot serial numbers that ensure only authorized ballots are cast and that each ballot is counted only once.
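To sketch how signed ballot serials might work, here's a minimal toy model. This is my own illustration, not any deployed voting product: a signing authority issues each ballot serial with a message authentication code, and the scanner counts a ballot only if the code verifies and the serial hasn't been seen before.

```python
import hashlib
import hmac
import secrets

class BallotAuthority:
    """Hypothetical election office that issues signed ballot serials."""
    def __init__(self):
        self._key = secrets.token_bytes(32)  # signing key kept by the authority

    def issue_ballot(self, serial: str) -> tuple:
        """Return (serial, signature) for an authorized ballot."""
        sig = hmac.new(self._key, serial.encode(), hashlib.sha256).hexdigest()
        return serial, sig

    def verify(self, serial: str, sig: str) -> bool:
        expected = hmac.new(self._key, serial.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, sig)

class Scanner:
    """Counts only authorized, not-yet-seen ballots."""
    def __init__(self, authority: BallotAuthority):
        self.authority = authority
        self.seen = set()
        self.count = 0

    def cast(self, serial: str, sig: str) -> bool:
        if not self.authority.verify(serial, sig):
            return False  # forged or unauthorized ballot rejected
        if serial in self.seen:
            return False  # duplicate serial: already counted once
        self.seen.add(serial)
        self.count += 1
        return True
```

A forged signature or a re-submitted serial is simply rejected, so stuffing the box with copies doesn't change the count.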

Optical scanners are just as vulnerable to computer security issues as direct entry systems. Because of this, published source remains an important security measure. Another safeguard is random hand counts to verify that the optical-scan tallies match counts made by human judges.

28 October 2009

Quote of the Day: Steve Forbes

"People say, 'Well, when you make it, you should give back.' ... But 'give back' sounds like you took something that didn't belong to you."
   -- Steve Forbes

23 October 2009

A Brief History of the Crash of 2008

There are a number of contributing factors to our present financial crisis but the biggest issue (and the catalyst that set it off) is the collapse of the financial derivatives market. My friend, Paul B. Allen, has started Crashopedia.com to document the causes of the crash in detail but it can get pretty thick. Here’s my simplified summary of what happened:
1970: GNMA "Ginnie Mae" issues the first Mortgage-Backed Security (MBS). An MBS is a way of collecting money to lend out in the form of mortgages: bonds in the MBS are sold and the resulting cash is invested in a pool of mortgages. FNMA "Fannie Mae" and FHLMC "Freddie Mac" follow suit.
1977: Under pressure from the Community Reinvestment Act, banks and other mortgage vendors begin issuing high-risk (also known as sub-prime) mortgages with correspondingly higher interest rates.
Early 1980s: Incentivized by the government and attracted by the high interest rates of these mortgages, bankers begin seeking ways to attract capital to invest in high-risk mortgages. Unfortunately, the market for high-risk investments is relatively small.
1983: Bankers invent the Collateralized Mortgage Obligation (CMO), a type of MBS in which shares are divided into risk "tranches." The idea is that if a lot of high-risk mortgages are pooled together, the risk is reduced because only a fraction are likely to default. Risk can be further reduced for some investors by dividing the bond pool, decreasing the risk for premium tranches and increasing it in lower tranches. Bond rating organizations like Standard & Poor's agree with this theory and offer high ratings.
1987: Realizing that splitting risk into packages can work for more than just mortgages, bankers invent the Collateralized Debt Obligation (CDO), which is just like a CMO except that it may be backed by corporate bonds, commercial paper or other kinds of loans.
Early 1990s: In pursuit of ever higher interest rates (from higher-risk mortgages) but having trouble placing the higher-risk tranches of CMOs and CDOs, bankers find ways of enhancing the ratings of these tranches by using Credit Default Swaps. This is just a fancy name for an insurance policy: the bankers pay an insurance premium and the insurer pays up if the underlying asset (mortgage or bond) fails to make payments. Bankers can afford the premiums because they collect more interest from the high-risk loans than they pay for the insurance. AIG becomes one of the leading insurers of these obligations.
1990s and 2000s: Bankers come up with all kinds of new derivative instruments such as CDOs that invest in other CDOs (CDO squared), Single-Tranche CDOs in which insurance is used to raise the rating of the entire package, Strips, REMICs, PACs, Floaters and more. The tantalizing returns of these investments cause people to ignore Warren Buffett's advice to invest only in things you understand.
2001: Driven largely by growth in financial derivatives, the Financial sector surpasses Information Technology to become the largest sector in the S&P 500 as measured by market capitalization.
2002-2005: Continuation of the longest sustained growth period in U.S. history masks the real problem with derivative instruments: an overall decline in the housing sector, or in the economy as a whole, would cause simultaneous defaults, something that neither diversification nor insurance accounts for. With those risks hidden, derivatives continue to offer stable income to investors while enriching the investment banks that handle them.
2006: The housing bubble bursts. Due to the ease of obtaining mortgages and the recent history of good real estate performance, a great deal of speculative building occurred in the early 2000s. By 2006 there was a surplus of homes in key markets like the West Coast, the Southwest, the Northeast Corridor and Florida. A mild recession at the time coincided with interest rate increases on Adjustable Rate Mortgages. The result was a wave of mortgage defaults.
2007: The mortgage crisis cascades into the whole economy. Rumors grow that we may be in for a recession. Mortgages become increasingly difficult to get as investors pull out of the mortgage market.
2008: Derivatives turn out to be much riskier than their ratings indicated. AIG becomes insolvent as large numbers of CDOs default and it is required to pay up; only a bailout by the Federal Reserve prevents it from going under. Credit markets freeze because bankers can no longer reliably determine the value of derivatives, which now account for an enormous part of the financial market. A new term, Toxic Asset, is used to describe these instruments because not only can they not be valued, but neither can any institution that owns a substantial portfolio of them. The first bailout act, including the Troubled Asset Relief Program, is passed, allowing the Treasury to buy up toxic assets in an effort to relieve the credit markets. Hundreds of banks with large portfolios of toxic assets fail and are taken over by the FDIC.
2009: As of this writing, the bailouts have had little success except to protect the profits made by irresponsible financiers. Credit markets are still extremely tight, the country is in a full recession, and unemployment is approaching 10% with certain markets well into double digits. Despite this, Congress and the White House have focused efforts on Healthcare Reform rather than considering regulations that might prevent irresponsible use of financial derivatives in the future.
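The tranche idea from 1983, and the correlated-default flaw that surfaced in 2002-2005, can be illustrated with toy arithmetic. The tranche sizes and loss rates below are invented for illustration and don't reflect any real CMO:

```python
def tranche_losses(pool_loss_rate, tranche_sizes):
    """Allocate a pool's loss rate across tranches, most junior first.
    tranche_sizes are fractions of the pool, ordered junior to senior."""
    remaining = pool_loss_rate
    losses = []
    for size in tranche_sizes:
        hit = min(remaining, size)      # this tranche absorbs losses until exhausted
        losses.append(hit / size)       # fraction of this tranche wiped out
        remaining -= hit
    return losses

# A made-up pool: junior 10%, mezzanine 20%, senior 70%.
tranches = [0.10, 0.20, 0.70]
normal = tranche_losses(0.05, tranches)  # scattered defaults: only the junior tranche is hit
crisis = tranche_losses(0.35, tranches)  # correlated defaults: losses reach the senior tranche
```

With 5% scattered defaults only the junior tranche takes losses, which is why the premium tranches earned high ratings. With 35% correlated defaults every tranche, including the senior one, loses money, which is exactly the scenario the ratings didn't account for.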
Missing from this history are all of the forewarnings. For example, the General Accounting Office warned in 1994 that regulation of the market was warranted. Congress held hearings on the subject multiple times in the 1990s and 2000s, and Warren Buffett famously wrote in 2002 that derivatives are "time bombs." There were many opportunities to prevent the train wreck before it happened. Unfortunately, the financial lobby was strong enough to prevent any meaningful reform.

I have some thoughts on how reform can be achieved without government intervention but those will have to wait for a future blog post. Meanwhile, this is yet another example of how government seems to be immune to forewarning. Action, if taken at all, occurs after the crash.

19 October 2009

Business Concept - A Virtual Secure Network

Introduction

This is the first in a series. Over the years I have come up with dozens of new business ideas. Some fraction of those dozens are viable and quite a number of them have appeared – though I haven’t been involved in most cases. This has taught me several things. My biggest lesson is that if I have a good idea, most likely someone else has that same idea and if I don’t pursue it, someone else is likely to do so.

Despite this long-known lesson my typical approach to a good idea has been to speak little of each idea in hopes that I may someday have a chance to make money from it. This, of course, hasn’t happened except in a few cases. I’m ready to challenge that strategy. Henceforth I’m taking an Abundance approach to new ideas. So long as it doesn’t compromise my obligations to my current employers, I intend to share my best ideas and simply see what grows.

Of course, if you get serious about pursuing one of these ideas we would both benefit if you were to contact me. I’ve given a lot more thought to these ideas than I can fit in a simple blog post.

A Virtual Secure Network

Most people are familiar with Virtual Private Networking (VPN). In a nutshell, a VPN allows you to connect your computer over the Internet to a private network. Usually this is used by businesspeople to connect to their office network and access private resources. It’s also used to interconnect networks between branch offices without the cost of dedicated private lines. Data that passes over the internet is encrypted to prevent eavesdropping. It may be argued that VPN is a misnomer since what you really have is a virtual connection over the internet to an actual network back at the home office.

I propose a true Virtual Private Network that would allow my laptop, my home desktop and my wife's computer to all communicate regardless of where they are located on the internet. This would enable secure file sharing, Remote Desktop, Remote Assistance and a host of other things to work conveniently without worrying about firewall traversal, routing and other things. It would also use encryption to protect such communications from external scrutiny. To distinguish this from existing Virtual Private Networks and to emphasize the built-in security features, I call it a Virtual Secure Network or VSN.

The biggest problem with any virtual networking protocol (VPN or VSN) is getting through the firewall. Most firewalls and routers will allow connections to originate inside the firewall but not from the outside. For example, my desktop PC at home can connect to Google.com but Google can’t contact my desktop because the connection is blocked by my home firewall/router. Specialized protocols such as UPnP NAT Traversal and Teredo have been introduced to fix this problem but adoption is limited.

A couple of years ago a colleague pointed me at this paper: Peer-to-peer Communication Across Network Address Translators by Bryan Ford, Pyda Srisuresh and Dan Kegel. It introduces a method of Hole Punching that opens TCP and UDP communication through a majority of firewalls including NAT firewalls. The system requires a publicly available server to coordinate the connection between two computers, but once that connection is made, the individual computers are able to communicate directly, so the bandwidth demands on the public server are modest. This is the primary method that Skype uses and, as those who have used Skype know, it simply works without any special network configuration. The Internet Engineering Task Force has worked on standardizing methods similar to those proposed by Ford et al. The original proposal is in RFC 3489 and an update is in RFC 5389.

I propose creating a virtual network adapter driver similar to those used for VPN connections. The virtual adapter would use the real network adapter in a computer to connect with a public server on the internet and register the computer's availability and the IP address of its firewall. Other computers in the same VSN could connect to that public server to discover the information necessary to broker a direct, encrypted connection.
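To make the public server's role concrete, here is a toy model of the directory it would keep. The class name, the TTL scheme and the data shapes are all my own assumptions, not a spec:

```python
import time

class VsnDirectory:
    """Hypothetical public-server directory: maps a VSN id to its members'
    last-known public endpoints, expiring registrations that go stale."""
    def __init__(self, ttl_seconds=120):
        self.ttl = ttl_seconds
        self.vsns = {}  # vsn_id -> {machine_name: (endpoint, last_seen)}

    def register(self, vsn_id, machine, endpoint, now=None):
        """Called by a machine's virtual adapter to announce its availability."""
        now = time.time() if now is None else now
        self.vsns.setdefault(vsn_id, {})[machine] = (endpoint, now)

    def members(self, vsn_id, now=None):
        """Return live members so a peer can broker direct connections."""
        now = time.time() if now is None else now
        entries = self.vsns.get(vsn_id, {})
        return {m: ep for m, (ep, seen) in entries.items()
                if now - seen <= self.ttl}
```

Because machines re-register periodically, the directory always reflects each machine's current firewall address, even as laptops move between networks.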

From the user’s perspective, it would appear as if all trusted computers in his/her VSN are immediately available and things like Remote Desktop, Remote Assistance, File Sharing, Printer Sharing and the like would "just work" like Skype.

From a business model perspective it’s convenient that a public server is required to set up the connections but the server isn’t involved in the actual transmission of the data. This means that a company could set up the public server and charge a modest subscription fee without the bandwidth cost of actually relaying the traffic. Even if IPv6 and Teredo become popular, the VSN would retain security advantages that preserve the business model.