
‘Rehypothecation’: Inside the Wall Street Practice that Could Ruin Bitcoin



Note: This is part 3 in a multi-part article series exploring rehypothecation and commingling in bitcoin and other cryptocurrency markets. Parts 1 and 2 are interviews with Caitlin Long, and parts 3 and 4 ask the question, “How did we get to a place where laws look like this?”

In order to understand why Wall Street veteran and cryptocurrency advocate Caitlin Long thinks that “rehypothecation” and “commingling” are going to be much-discussed topics in the cryptocurrency industry, we must understand Long’s background. The best way to do this is to look at rehypothecation and commingling in the context that Long did — traditional securities — and to understand that, we need to explore how the modern broker-dealer came to be financed by the repo (repurchase) market and asset-backed commercial paper (ABCP).

At the beginning of our conversation, Long says that everyone’s “backgrounds bring them to who they are today and bring them the knowledge base for recognizing trends.” In Long’s case, this is certainly true, and it explains why she is blowing the whistle on the wave of counterparty risk that will accompany Wall Street’s entrance into cryptocurrencies if firms bring their current settlement practices with them. Beyond uncovering the risks introduced by Wall Street’s, and particularly Intercontinental Exchange‘s (ICE), interaction with cryptocurrency, Long explains the issues and costs associated with Wall Street’s current system.

This article reviews the risks and inefficiencies Long brings up and uncovers how they came about. During my interview with Long, several questions continuously came to mind, but the foremost was, “How did we get here?” To adequately understand Long’s concerns, a (rather long) primer on the market’s evolution into the inefficient behemoth it is today is necessary.

How We Got Here

The New York Stock Exchange was not always the towering behemoth it is today. Long before its market capitalization hit $21 trillion (larger than the U.S. GDP of $18 trillion), before September 11th, two world wars, the Crash of 1929 and the Great Depression it led to, before even the SEC, 24 men stood around a buttonwood tree outside of 68 Wall Street in New York City in 1792 and signed an agreement to give each other preference in the trading of securities. Revolutionary War bonds and stock from Alexander Hamilton’s newly created bank dominated the “market,” and as the exchange grew, the number of traders grew, and the men moved to bigger and bigger buildings until 1865, when the current NYSE building was adopted. In the decades immediately preceding the turn of the 20th century, the NYSE stood out for its relative primitiveness, especially given its daily trading volume.

The most jarring difference between the NYSE and its contemporaries in 1890 was its lack of a clearinghouse. A clearinghouse is an institution designed to settle transactions between a network of buyers and sellers. Those of us living in the 21st century might ask, “Why not just settle at the time of the transaction? I’ll send money, and you send the shares, and the transaction will be settled in real time.” This type of settlement was not possible before digital systems existed. Much as a private key today constitutes a claim to the actual bitcoin associated with a wallet, paper certificates were themselves the claims on a company (not merely representations of a claim held by an individual). As such, whenever a trade was made, shares were delivered to the firm that bought them in exchange for money by a specified time the next day.

The Need for Centralized Clearing

Prior to the introduction of multilateral netting on the New York Stock Exchange clearing was done on a “bilateral basis.” This means that brokers were required to write checks and trade shares for every transaction at settlement time. This required much higher levels of liquidity than multilateral netting, which is shown below:

Figure A2: Bilateral and Multilateral Netting

Take a look at Scenario A. Each letter (A, B, C, and D) represents a brokerage firm over a trading day. The arrows represent securities trades between the firms. In the example above, all shares cost $1, so for every arrow going from one firm to another, n shares (where n is the number next to the arrow) are transferred to the firm the arrow points to, and $n is paid to the firm transferring the shares (share prices are kept constant for simplicity of the example). Although the transactions are made in the order described below, settlements all take place at 2:15 p.m. the next day, and every firm must have adequate capital and shares to fulfill the trades it made the previous day, or it will “fail to deliver” and risk bankruptcy. What follows are Broker A’s trades for the day:

  1. Broker A starts with 90 shares and $10
  2. Broker A sells 70 shares to broker C for $70 and now has 20 shares and $80
  3. Broker A buys 80 shares from Broker D and now has 100 shares and $0
  4. Broker A sells 100 shares to Broker B for $100 and has 0 shares

The next day at 2:15 p.m. it’s time to settle. Because all settlements happen simultaneously, Broker A must deliver 70 shares to Broker C and 100 shares to Broker B (170 shares in total, of which it holds only 90) and pay $80 to Broker D (of which it holds only $10). Obviously, there’s a problem here. Broker A cannot fulfill its share obligations because it’s waiting on Broker D to deliver shares (at 2:15 p.m.), and it can’t meet its cash obligations because it’s waiting on payment for the shares it sold.
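The arithmetic of Broker A’s shortfall can be sketched in a few lines of Python. The trade list and starting balances come straight from the example above; the variable names are, of course, just illustrative:

```python
# Broker A's position under bilateral settlement, using the trades from
# the example above (all shares priced at $1 each). Every delivery happens
# simultaneously at 2:15 p.m., so GROSS obligations matter, not net ones.

# (seller, buyer, shares) -- the seller delivers shares, the buyer pays $1/share
trades = [
    ("A", "C", 70),   # A sells 70 shares to C
    ("D", "A", 80),   # A buys 80 shares from D
    ("A", "B", 100),  # A sells 100 shares to B
]

start_shares, start_cash = 90, 10  # Broker A's opening balances

# Gross obligations: everything A must hand over at settlement
shares_owed = sum(n for seller, buyer, n in trades if seller == "A")  # 170
cash_owed = sum(n for seller, buyer, n in trades if buyer == "A")     # $80

print(f"A must deliver {shares_owed} shares but holds {start_shares}")
print(f"A must pay ${cash_owed} but holds ${start_cash}")
# A is short 80 shares and $70 -- it "fails to deliver" unless it borrows.
```

This is why bilateral settlement forced every firm to borrow against the gap between its gross obligations and its holdings, even when (as here) the firm's net position is perfectly sound.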

This required all firms to procure massive loans from banks (typically with other stocks as collateral for loans in a process called hypothecation) every day to meet their obligations before being paid by other firms. This process was called “over certification” and led to several problems for banks and brokers alike.

First of all, there was the variability of the interest owed on these overnight loans. The rate varied dramatically, hitting as high as 125 percent and leading to the failure of several brokerage firms that could not afford to pay back the wildly inflated loans. Even more concerning was that this increased “counterparty risk” and caused a phenomenon called “contagion,” a domino-like effect where the failure of one brokerage firm would lead to the failure of others. For instance, in the example above, if any firm on the web could not pay and defaulted, every other firm on the web would default. In reality, brokerages are much more complicated than this simple trading web and have other assets to use as collateral for loans, but the complexity of these webs necessitated more and more loans and caused dozens of brokerage failures when panics hit.
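The domino effect can be illustrated with a toy simulation. The payment web below is hypothetical (the figure’s actual web is more tangled), and it makes the same simplifying assumption as the example: each firm depends entirely on its incoming payments to make its outgoing ones, so one missed payment topples everything downstream:

```python
# Toy sketch of settlement "contagion": if one firm cannot pay, every firm
# counting on its payment fails in turn. The chain below is hypothetical.

owes = {            # firm -> firms it must pay at settlement
    "A": ["B"],
    "B": ["C"],
    "C": ["D"],
    "D": [],
}

def cascade(first_default):
    """Return the set of firms that fail once `first_default` misses payment."""
    failed, frontier = set(), [first_default]
    while frontier:
        firm = frontier.pop()
        if firm in failed:
            continue
        failed.add(firm)
        # every firm this one owes now misses an incoming payment and fails too
        frontier.extend(owes[firm])
    return failed

print(sorted(cascade("A")))  # → ['A', 'B', 'C', 'D'] -- one default sinks the web
print(sorted(cascade("D")))  # → ['D'] -- a default at the end of the chain stops there
```

The position of the first default matters: a failure upstream of many payment chains drags down far more firms than one at the edge, which is exactly why dense bilateral webs amplified panics.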

This “contagion” led to catastrophes like the Panic of 1873, in which 57 brokerage firms failed and the president of the Bank of the State of New York threatened that lending to brokers for certification would be cut off entirely unless overnight borrowing was curbed. The solution, which had already been implemented by other exchanges, came in the form of multilateral netting. At the end of every settlement period, a central clearing party (CCP) would net the asset obligations the firms owed one another. In the example above, rather than delivering 170 shares and $80 gross, Broker A simply delivers its 90 shares and receives $90, with no borrowing required. According to one study, after the introduction of CCPs, for every $25 million in securities traded, just $5 million actually changed hands, a dramatic improvement over the system of bilateral settlement. Papers have even argued that the increased counterparty risk associated with bilateral netting on the NYSE prior to the introduction of a central clearing party resulted in a 0.24 percent premium over other exchanges because default risk was so high.
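What the CCP does is mechanical: sum each firm’s inflows and outflows and settle only the difference. A minimal sketch, reusing the trade list from the example (signs are a convention chosen here, not from the source: positive means the firm receives, negative means it delivers):

```python
# Multilateral netting: a CCP collapses each firm's gross obligations into a
# single net position per asset. Trades are from the example (shares at $1).

trades = [
    ("A", "C", 70),   # A sells 70 shares to C
    ("D", "A", 80),   # A buys 80 shares from D
    ("A", "B", 100),  # A sells 100 shares to B
]

net_shares, net_cash = {}, {}
for seller, buyer, n in trades:
    net_shares[seller] = net_shares.get(seller, 0) - n  # seller delivers shares
    net_shares[buyer] = net_shares.get(buyer, 0) + n    # buyer receives shares
    net_cash[seller] = net_cash.get(seller, 0) + n      # seller is paid
    net_cash[buyer] = net_cash.get(buyer, 0) - n        # buyer pays

# Broker A's net position: deliver 90 shares, receive $90 -- no loans needed.
print(f"A: {net_shares['A']} shares, ${net_cash['A']}")
```

Instead of shuttling 250 shares and $250 of gross obligations through the web, Broker A settles one 90-share delivery against one $90 receipt, which is the liquidity saving the “$5 million per $25 million traded” figure captures.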

The Paperwork Crisis

Figure: NYSE settlement times over the years

As the years wore on and trading volume slowly climbed, the crisis that would loom in the late 1960s and early 1970s began to rear its ugly head. While multilateral netting dramatically reduced the paperwork required of banks, it did not eliminate it entirely. At the end of the settlement period (2:15 p.m. the next day), shares and checks were still transferred, and back-office clerks labored away at forms to make sure the transfers were in compliance with all pertinent regulations. So long as trading volume remained relatively low, this was not an issue.

Trading volume climbed rapidly until 1929, when the stock market crashed and the United States — and the rest of the world — entered the Great Depression. Even then, volume did not return to the levels of just a few years earlier: trading volumes in 1930 were still nearly double what they were in 1926. Nonetheless, the crash had taken its toll on the securities industry, which could not afford to simply hire its way out of the crisis. Clerks in the back offices of brokerages had to work harder and harder to keep up, and in 1933 settlement time was moved to t+2 days. Settlement time was extended several more times for similar reasons, without much fanfare, until it hit t+4. The move to t+5 would not be so quiet.

Centralized clearing on the NYSE was the first step in the formation of the modern financial system. The development is significant because instead of firms exchanging shares for money directly, an intermediary that charged for its services was introduced. It’s important to remember that while this was a move away from decentralization, it was a necessary move, because the technology for instant settlement had not yet arrived. At this point in our story, brokers (and their customers) still hold their share certificates. Even with a party through which trades are settled, once settlement completed, one party would still walk away with stock certificates denoting ownership in the company.

This would all change three-quarters of a century later as a direct result of the system’s success. As the years wore on, increased advertising led to increased retail investment, and the prospect of capital gains propelled by that retail investment drew the institutional players of the day into equities markets: insurance companies and pension funds, which, according to Wyatt Wells, an expert on the period, had “traditionally put their money into bonds or real estate” and now “started buying stocks in large quantities.” This, combined with a rise in the popularity of mutual funds so dramatic that their number doubled in a matter of years, produced a dramatic increase in stock market volume and sent prices soaring.

This continued until the late 1960s, when the system hit “the paperwork crisis.” Wells says “certificates for more than 100 shares were rare.” He goes on to give the example of an investor who purchases 500 shares of a stock and receives five 100-share certificates. This doesn’t sound that bad until you realize that trading averaged over 12 million shares a day in 1968. While the introduction of the clearinghouse had reduced paperwork, no one at the turn of the 20th century could have anticipated this level of volume. The volume is even more astounding when you consider the paperwork beyond the issuance of shares. Wells says that “the purchase or sale of a security might require as many as 68 steps.” Another study found that brokers used an average of 33 different forms for a single stock transfer. The paperwork was unmanageable.

To cope with the crisis the NYSE’s member firms started massive hiring efforts. “Every week the New York Times contained 100 columns of ‘help wanted’ advertisements placed by securities firms,” Wells explains. In 18 months, the number of clerks increased from 22,000 to 28,000. Any level of hiring on this scale soaks up the talent pretty quickly, and clerks were described by many as incompetent, overworked (often working night shifts), and prone to mistakes. Millions of dollars were misplaced and firms went bankrupt under the pressure.

As more and more firms went under, the NYSE scrambled for a solution. Computer systems showed promise — some firms had been using them but with limited success. The machines were far more complicated to implement than anyone had foreseen, and programmers were in very short supply. Additionally, many firms were under severe capital constraints that were exacerbated by the additional hiring of clerks and could not afford expensive computers that proved even more expensive to implement. Even when they could, there were cash flow issues: clerks could be elastically scaled to meet capacity by means of hiring and firing. With a computer, once you bought it, there was no getting rid of it.

The Rise of the CCS

The SEC began an investigation into the problem and landed on two possible solutions. The first was to create a “decentralized network” that would link transfer agents and allow them to transfer “uncertificated shares” (shares not represented by a physical stock certificate) on electronic order books. This system had several implementation issues that caused it to be pushed aside.

First of all, we’re talking about 1971, the same year the first microprocessor was introduced: a 4-bit, 740 kHz processor able to address only 640 bytes of RAM. Apple and Microsoft wouldn’t be founded for another five years, and Steve Jobs, Bill Gates, and Steve Wozniak were all midway through high school. The second reason is that this model would require that all shares be dematerialized (made uncertificated), and it was believed “that shareholders have a psychological aversion to giving up their paper.” This sounds downright stupid now, but in an era when people did not use digital devices from the time they woke up to the time they went to sleep, it was understandable.

The alternative was to “create a centralized depository in which share certificates would be kept in custody.” Under this model, paper certificates would be preserved and put into a central place (a literal sealed vault). The centralized model would issue a representative instrument. As Wells said, “It would register all securities it held under its own name and direct dividends, voting proxies, and the like to the brokerage houses, which could then send them on to customers.” While this system had issues, it was far less ambitious than the decentralized system. As David Donald wrote in The Rise and Effects of the Indirect Holding System: How Corporate America Ceded Its Shareholders:

“The mood at this time was very far from the limitless trust in technology of 1969 when Apollo 11 had landed on the moon, and the computerized NASDAQ project had been set in motion. Under such circumstances, it is not surprising that Congress selected a safe, low-tech solution that shut out any future risk.”

So the centralized solution was chosen. Its implementation, as it happened, was already underway in the form of the “Central Certificate Service” (CCS). The CCS was designed to immobilize shares by holding them for brokerage firms. This meant that rather than physically moving securities around, the CCS would transfer ownership on its books. The system had several issues.

First of all, the software was incredibly buggy and prone to failures. Modern software development standards around testing had not yet been invented, and because hardware was so expensive, there were very few failovers. On top of the software issues, the legal system made an assumption that was incompatible with share immobilization: the laws of all 50 states treated physical possession of the certificate as the sole proof of ownership in a company. The system also did not allow shares to be used as collateral in most cases. These issues would plague the system for years to come but failed to hinder its expansion entirely.

Toward the DTCC

By 1971, the CCS was expected to have cut the transfer of certificates by as much as 75 percent. Yet at its peak, it transferred just 10,000 shares per day, partly because significant glitches forced the system to stop accepting new shares for a while. There were other issues: the program was voluntary, and many brokers did not wish to give up their paper. To combat these issues, banks and exchanges worked together to form the Banking and Securities Industry Committee (BASIC), which resulted in the creation of the Depository Trust Company and a central depository in New York.

The system had come far later than it should have and left the securities industry with some serious scars. As David Donald explained, “over 100 brokerage firms either entered bankruptcy or were acquired by stronger competitors.” The damage was done, and the firms that emerged from this crisis through mergers and cheap acquisitions would rule Wall Street going forward. The damage was so severe, and so deeply impacted the institutions that underlie modern Wall Street, that the aversion to change lasts to the current day. More significantly, the immobilization of shares would be written into the laws that govern the SEC, setting an already inflexible system in stone. In 1975, the SEC was required by the Securities Acts Amendments of 1975 to “end the physical movement of securities in certificates.” These words entrenched a system that had originated as a temporary emergency measure.

The Cracks Begin to Show

A year after Congress passed the aforementioned 1975 Amendments, the SEC released a study examining the consistency of the new law with the Securities Exchange Act of 1934 (which created the SEC) and investigating the effectiveness of communication between companies and their shareholders under the current system. The study reported that the new system “makes communications between issuers and their shareholders more circuitous.” The report went on to show the enormous costs, in both time and money, associated with the new system. Donald points out, though, that the benchmark the SEC measured the new regulation against was not the “utopian solution” of a “certificateless society” but the system “that led to the disappearance of over 100 brokerage firms.” What was missing from the 1976 report was any acknowledgment that this system was meant to be temporary. On the contrary, the new system was lauded as “the foundation of a national system for the clearance and settlement of…transactions.”

This foundation began to show some serious cracks that needed repair fairly quickly. Shareholder communication prior to the introduction of intermediaries had been a simple affair: every stockholder was listed on a “stockholder list” with their name and contact information. Issuers needed only go through the list and send information to everyone on the list. This is how shareholder communication looked after the introduction of intermediaries:

As you can see, the indirect holding system vastly increased the cost and complexity of communication with shareholders. As a result of this complexity, companies like ADP (and later Broadridge, which now controls over 98 percent of the U.S. market for proxy voting services) were formed to handle shareholder communication for issuers who deemed it too expensive, or simply too complicated, to deal with a system in which they “play blindfolded” and “cannot know what lies beyond the next wall in the intermediary pyramid before making an inquiry.” The costs associated with this model are staggering. In 2012 it was estimated that issuers pay $200 million a year to communicate with their own shareholders (excluding costs associated with mailing and printing). This troubles issuers, with the SEC reporting that it was among “the most persistent concerns expressed” by them. The system has also rendered shareholder lists useless, with Cede & Co. (a subsidiary of the DTCC) listed as the sole shareholder of record for hundreds of corporations.

Stay tuned for part four in CCN’s series on rehypothecation and Wall Street.
