This past summer at Union Square Ventures, Max Heald joined us as our summer intern. During his 10 weeks at USV, he helped us better aggregate insights shared across our portfolio companies through data collection. Because part of this process involved reading through company board decks, he came away with a few top takeaways from his project, which he recently posted about on his Medium account. Below is the full text of his original post:
After graduating from college in June, I had the chance to spend three months at Union Square Ventures, helping with a project that created anonymous aggregate insights for USV portfolio companies by analyzing data across stages, customers, and industries. In order to not put a lot of extra reporting effort on our companies, we approached this project by drawing from data in existing board decks. This work afforded me the unique opportunity to familiarize myself with many of our portfolio companies’ board decks, and the communication styles of the leadership teams behind them.
I quickly found out that there is no one standard board deck for a USV company (and in fact, we’re proud of that), but I did observe a few commonalities among the most effective decks.
At a high level, the best decks accomplish three things:
Here are 6 ways I saw this done best:
There’s a difference between telling someone, “Our GM decreased by 8% last month”, and “Our GM fell by 8% because we ran a promotion which netted us a 20% bump in total clients.”
The best board decks anticipate questions in advance, and answer them with clear, concise data. My favorite example is a simple graph of total ARR versus a bar graph of ARR broken down by addition and subtraction of clients, and expansion and contraction of client spend. Our portfolio company eShares does an excellent job of this:
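The ARR breakdown described above can be sketched as a simple computation. A minimal, hypothetical example (all figures are invented for illustration; this is not eShares' actual data):

```python
# Minimal sketch of an ARR "bridge": the net change in ARR decomposed into
# new clients, churned clients, expansion, and contraction of client spend.
# All figures below are hypothetical.

def arr_bridge(new, churned, expansion, contraction):
    """Return the net ARR change and its signed components."""
    net = new - churned + expansion - contraction
    return {
        "new": new,
        "churned": -churned,
        "expansion": expansion,
        "contraction": -contraction,
        "net_change": net,
    }

bridge = arr_bridge(new=120_000, churned=40_000,
                    expansion=30_000, contraction=10_000)
print(bridge["net_change"])  # 100000
```

Plotting these four signed components as stacked bars next to the total ARR line answers "why did ARR move?" before anyone has to ask.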
Another excellent example of effective data context is a 12-month trailing P&L. Most of the decks I read had monthly data for the most recent quarter, and projections for a few months out. I ran into a couple of problems here: First, many companies’ board meetings happen infrequently enough that there are significant gaps in data. This makes it extremely difficult to understand a company’s financials in the context of their most recent fiscal year without making a data request to the company, something we generally like to avoid. Even when the data does overlap, it still requires piecing together information from multiple decks to get a decent view of the last year. As a recent college graduate / workaholic, I’m happy to do this. I imagine a board member of a venture-backed company would be less inclined.
For these reasons, the 12-Month Trailing P&L is perfect; not only is it an easily-updated table, but it’s a way to view the scope of a company’s financials over a substantial amount of time. Though things often change drastically from quarter to quarter, most investors want to see how the company has grown through multiple phases. And though projections are usually wrong, including them for the next quarter or two is usually worth doing, too.
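Mechanically, a trailing-12-month figure is just a rolling sum over monthly values, which is why the table is so easy to keep updated. A minimal sketch using only the standard library (the revenue series is invented):

```python
from collections import deque

def trailing_12m(monthly_values):
    """Return the trailing-12-month sum after each new month's figure.

    Early entries sum fewer than 12 months until the window fills up.
    """
    window = deque(maxlen=12)  # automatically drops the oldest month
    totals = []
    for v in monthly_values:
        window.append(v)
        totals.append(sum(window))
    return totals

# 18 months of hypothetical monthly revenue, growing linearly
revenue = [100 + 10 * i for i in range(18)]
ttm = trailing_12m(revenue)
print(ttm[-1])  # sum of the most recent 12 months
```

Each board meeting, the team appends the newest months and the window slides forward; no historical rows ever need to be re-derived from old decks.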
Board meetings are typically the only time when all of the key stakeholders of a company are in the same room, so naturally, it makes sense to wring as much as possible from these discussions.
Some of the best decks I read do this by kicking off the deck (and thus, the meeting) with a board-only Q&A. This encourages board members to come in prepared to fire away to the CEO and their leadership team.
Structuring a deck this way is a powerful method of ensuring preparation from a company’s stakeholders and of running meetings effectively. Give them the data, and let them loose. It also sets aside time for the outside stakeholders who know a company best to have an unfiltered discussion during which unexpected ideas may rise to the surface, before the team then uses the rest of the time to address their own pre-planned questions.
The best of these team-led sections of decks tend to focus on a few key issues, rather than providing comprehensive data streams; evaluating potential key hires, discussing a fledgling revenue stream, or a potential acquisition offer could all be strategic issues worth spending time on.
Great board decks balance gathering information on what the team needs help with, and providing critical updates the board needs to hear. Again, it’s about style, and leaders of a company will know what’s best. But by structuring decks to gather key input from the stakeholders in the room, teams can ensure they are putting board meeting time to its best use.
Board meetings are not pitches. As obvious as this is, there is no need to sugarcoat, talk around, or avoid data points which are less than ideal.
The most effective board decks I read were direct about what the teams could improve, and asked for specific feedback from the board on how they could accomplish it, or better yet, on their already-thought-out plan for accomplishing it. Board members are a brain trust for a company, and they know it better than anyone outside the team; if there is a place to be blunt about the challenges a company is facing, it’s with them.
Most companies track many statistics for internal use, but when it comes to two-hour board meetings, there are often a few key metrics which are the best indicators of the health of a business. As with discussion topics, narrowing in on these metrics can focus the conversation and give the board a broad understanding of the company’s situation quickly. In the interest of time and productive conversation, it’s usually better to go deep than broad.
Considering SMART metrics (Specific. Measurable. Achievable. Relevant. Timely.) here might be a useful litmus test for figuring out what to track. Whether it is gross margin, active users, and LTV, GMV and MRR, or even a statistic uniquely effective for the business, the best board decks bring out the key levers of the company for a more productive discussion.
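For concreteness, two of the metrics named above can be computed in a few lines. This is a hedged sketch: the LTV formula here is one common simplification (margin-adjusted revenue over expected customer lifetime), and all inputs are hypothetical:

```python
# Two common board-deck metrics, computed with invented inputs.

def gross_margin(revenue, cogs):
    """Gross margin as a fraction of revenue."""
    return (revenue - cogs) / revenue

def ltv(arpu, margin, monthly_churn):
    """Simplified customer lifetime value:
    margin-adjusted monthly revenue divided by monthly churn
    (expected lifetime in months is 1 / churn)."""
    return arpu * margin / monthly_churn

gm = gross_margin(revenue=500_000, cogs=150_000)  # 0.7
print(round(ltv(arpu=50, margin=gm, monthly_churn=0.02), 2))  # 1750.0
```

The point is not the arithmetic but the selection: a deck that tracks two or three such levers consistently, month over month, gives the board trend lines instead of snapshots.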
Most decks I read tended to put these up front, or worked them into their brief overview of company progress, as a way to kick off the discussion on a high note. The part I enjoyed most, though, was how many companies would include recognition for the non C-level employees who spearheaded these big wins for the company.
Especially for people who prefer “the right people knowing” of their contribution over more public, company-wide recognition, this is an excellent way to show appreciation for team members who excel.
Board decks are art. And art often tells us as much about the artist as it does about itself. I came across a consumer platform company’s deck which really popped; translucent spreadsheets and statistics against varying hues of neon. A security-focused company’s deck was a stoic black-and-white, the pre-set Powerpoint template unchanged in any way. Jobbatical’s deck, as you can see below, throws in just a bit of personality.
Each of these three decks conveys a different company culture. One focused on imbuing its office with as much liveliness as the platform they’re building. One so serious about security, it doesn’t have time for design. One playful, unorthodox, and a little weird, in a great way.
Every board meeting is a chance for company leaders to communicate to their stakeholders what they have built. Half of that work is communicating the culture of the workplace they have created. And since these decks are meant to be read as well as presented, it’s crucial that culture is communicated through the style of the deck, and not just the presentation of it. The best teams know this, and use it as a way to demonstrate their leadership.
While they may not be built to be as flashy as a pitch deck, board decks can be an incredibly strategic and constructive communication tool for your business.
They are a chance to show your stakeholders how well you can synthesize the key levers of your business, discuss them efficiently and effectively, and communicate directly any problems that need to be tackled. They are an art, and a chance to show off the culture and team you have built.
And as an opportunity for a demonstration of effective leadership, to the stakeholders who are the most active supporters of your company, they are well worth the effort.
Max recently graduated from Northwestern University, where he was an Agile coach and two-time founder. He is currently interested in venture and startup roles that allow him to meet and learn from great founders.
Protocol Labs made a series of announcements earlier today including that Union Square Ventures made an equity investment in the company late last year. We are thrilled to be working with Juan Benet and his team and excited to be able to share some of our thinking here.
As most of you know, all of us at Union Square Ventures believe in the decentralized, emergent, permissionless innovation that was so central to the vitality of the early Internet. Prior to the Internet, the media industry was dominated by a small number of companies that controlled access to their respective mediums: print, television, radio, cable, etc. It was the broad adoption of a set of open protocols, like TCP/IP, SMTP and HTTP, that allowed any creator on the planet to get to any consumer and unleashed the wave of innovation that led to the consumer Internet we know today. That vital innovation is threatened today by consolidation at the applications layer of the Internet. Publishers find themselves becoming commodity content suppliers in a sea of undifferentiated content in the Facebook news feed. Web sites see their fortunes upended by small changes in Google’s search algorithms. And manufacturers watch helplessly as sales dwindle when Amazon decides to source products directly in China and re-direct demand to their own products.
The source of this market power is control over the data we all contribute as we interact with these services online. The key to mitigating the market power of the web giants is open protocols further up the stack. If an open public communications network (the Internet) unlocked the distribution bottlenecks that characterized the media industry, an open public data layer is the key to unleashing another wave of innovation. It is the mission of Protocol Labs to coordinate the efforts of a large and passionate community of open source contributors to create these protocols.
It is an audacious mission. As you move higher in the stack, the complexity of the protocols is exponentially greater. Luckily, they are not starting from scratch. Juan Benet, the founder of Protocol Labs, is the creator of IPFS (the Interplanetary File System), an increasingly popular protocol that allows content on the web to be addressed directly instead of by reference to a file located on a specific server. This subtle but profound change means that a provably unique piece of content is no longer tied to a specific server but can exist anywhere there is a little surplus storage capacity on the web. Protocol Labs and everyone else working on open protocols today has another advantage that was not available to the creators of the original Internet protocols. They have blockchains.
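The core idea behind content addressing can be illustrated in a few lines: the address of a piece of content is derived from the content itself, not from the server that happens to host it. This sketch uses plain SHA-256 as an assumption for clarity; real IPFS addresses are multihash-encoded CIDs, which this omits:

```python
import hashlib

def content_address(data: bytes) -> str:
    """Derive an address from content itself, not from its location.

    Illustration only: IPFS uses multihash-encoded CIDs rather than
    raw SHA-256 hex digests.
    """
    return hashlib.sha256(data).hexdigest()

a = content_address(b"hello, web")
b = content_address(b"hello, web")
c = content_address(b"hello, web!")

# Identical content yields an identical address wherever it is stored;
# any change to the content, however small, yields a different address.
assert a == b and a != c
```

Because the address proves what the content is, any node with spare capacity can serve it, which is what frees content from living on one specific server.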
Blockchain based crypto tokens have been described as the native business model of open source software. They have the promise of being able to fund the critical shared infrastructure of the information economy in a way that equity cannot. Protocols are more valuable when they are open and shared broadly. But equity is most valuable if a company can extract monopoly profits from a resource they exclusively control. When a protocol incorporates an incentive in the form of a crypto token, it can resolve this inherent contradiction.
In the next few weeks, Protocol Labs will be introducing Filecoin, a crypto-token to support the development of a next generation protocol that enables a decentralized data storage layer on top of IPFS. By funding this effort through the sale of a token rather than the sale of additional equity, they ensure that the creators and consumers of value in the storage network (the people who buy storage with tokens and the people who earn tokens by storing files for others) will benefit directly from the success of the network and the protocol that defines it. This happens because the protocol sets limits on the number of tokens that can ever be issued. Because the tokens are the currency in this marketplace for storage, as the protocol becomes more broadly adopted and the marketplace for storage grows, demand for the token increases, and the currency appreciates. So the tokens that investors purchase in a pre-sale to fund the engineering effort to build the protocol, the tokens people hold in their wallets in anticipation of buying storage and the tokens people earn by providing storage capacity all grow in value over time.
The bitcoin protocol demonstrated that it was possible to finance an enormous computing infrastructure - reportedly one with a hashing power greater than all the super-computers in the world - with a crypto-token. But it does so at a great cost. Securing the bitcoin blockchain could by 2020 consume as much electricity every year as all of Denmark. With Filecoin, Protocol Labs hopes to secure the network with useful work - work that has to get done anyway - storing files for people. Over the next few years, Protocol Labs plans to develop a series of protocols that could become the infrastructure of a more decentralized economy. By funding these efforts through sales of crypto tokens, they ensure that the economic value of the protocols is shared broadly. By designing systems that secure the network by doing useful work, they respect the limits of our natural resources.
But they have also made one more investment to further the development of this shared infrastructure: they have invested heavily in the legal design of the Filecoin token and the pre-sale process in the hopes of demonstrating that these offerings can be done responsibly in respected jurisdictions. We have already made the point that the pre-sale of a crypto-token is different from equity. It is also not a commodity, a currency or a futures contract. It is something new. As such, it does not fit neatly into any existing regulatory or legal framework. Many of the recent offerings of crypto-tokens have avoided the difficult task of fitting the round peg of crypto-tokens into the square hole of existing regulatory frameworks by raising money in a foundation based in Switzerland, and offering the token for sale in a jurisdiction like Malta or Singapore. This is a reasonable approach. The reality is that these offerings are inherently global. Pretty much anyone anywhere can participate, and the recent returns in the sector have caught the eye of investors around the world. In the near term, it benefits the existing holders of the token to have access to global demand in an unregulated offering, but not all of these offerings will end well. Protocol Labs is playing the long game. They believe that a pre-sale of a crypto-token is an important new funding mechanism that will support the creation of a rich ecosystem of protocols that decentralize the web, democratize access to services and spread the wealth created by networks beyond a narrow cohort of equity owners. To further that goal, they are working with an army of lawyers to create a mechanism that is defensible under U.S. law and regulation - one that can hopefully be a template for others who want to build infrastructure that lasts.
Protocol Labs is creating new infrastructure in a new way. We think their commitment to the re-decentralization of the web will lead to protocols that are powerful, broadly embraced and generative. Their approach to financing their work will spread the value that is created more broadly. Their commitment to investing in a legal approach that respects the current regulatory environment while not compromising on the promise of the new technology will be a foundation others can build on. We are thrilled to be along for the ride.
The USV portfolio network consists of 67 active companies with over 7,000 employees across the US, Canada and Europe. We believe in using the power of networks to help our portfolio companies build better businesses through peer to peer learning, external network relationships, and shared resources. On average, we host 60 portfolio events each year, and in 2017 we are on track to host nearly 80 in total.
Within the USV portfolio, we have several companies working in the healthcare space. Based on their feedback and recommendations we organized an intimate session with leaders from the USV portfolio, USV partners and staff, and external industry leaders. We held this discussion at our office and called it “Building the Healthcare Stack”.
This “Hacking Healthcare” event was the brainchild of USV CEOs who wanted to connect with other healthcare organizations and professionals that are spearheading initiatives and development within this industry. They wanted to debate high-level areas such as telemedicine as well as dive into granular challenges facing open source health data and patient care.
In USV fashion we organized this as an “unconference” style conversation and segmented the day into 4 high-level topics.
Our goal for this session was for participants to come away with thoughtful perspectives on a variety of areas. We chose specific, complex topics that would seed provocative and unfiltered discussions, and I have chosen to highlight three of the top takeaways.
1. Healthcare companies are investing too much money in walled gardens
Every medical facility (hospitals, doctor’s offices, research centers, etc.) controls its own data — what it collects, and how it is stored, cleaned, and analyzed. There is no universal system that allows patient data to connect and talk to each other across facilities.
One reason why walled healthcare gardens exist, and in fact flourish, is that there are no financial incentives to open them up. As a result, healthcare organizations only see the world through their own institutional myopia. This nearsighted mentality can have lasting effects on staff, the patient experience, and culture. One example of a set standard is the Press Ganey score, which is calculated from post-appointment patient surveys. Some facilities use these scores to award bonuses, which can cause tension and misplaced incentives for physicians. This can also shift their priorities from treating someone as effectively as possible to focusing on personal soft skills and excellent bedside manner. As we see more digital health companies emerging in the space, we may need to find ways to blend and ultimately open up some of these datasets.
2. How can we open source healthcare data?
In software, open source data is anything made publicly accessible for others to use or modify. But what does this mean when talking about personal data in healthcare, such as patient records and diagnoses?
One view in favor of open sourcing health data is that asynchronous medical care is absolutely required to improve the efficiency of the medical industry. In theory, this could allow a patient to visit any clinic in the world, see a physician, and get a diagnosis utilizing their existing data. This diagnosis could be based on the patient’s medical history as well as aggregated data from other doctors and facilities.
One point made against open source health data is that since clinical data is not being captured in structured formats, there would need to be major additional regulation around the input and “cleanliness” of the data points. It would also be extremely time consuming and costly to implement this open source database. Finally, if this data is open to anyone, then patients would of course have access as well. Is this in fact useful to you as an individual, or is it too much clutter and detail to comprehend in a relevant way? If it is accessible to anyone, it could be incredibly susceptible to abuse and malpractice.
3. More data ≠ better data.
Some of the participants challenged the group on whether healthcare data should be considered “big data.” One opinion was that big data is dependent on the quality of the data, not the quantity. In other words, if you feed bad data into the system, the AI and data output will produce bad results. Deep learning has struggled to work in healthcare because, unlike in other industries, there is not yet a lot of big data to be analyzed. Take Facebook, for example — its facial recognition technology serves millions of people through the tagging feature. There is no system of that kind built yet for health systems.
Another crucial issue worth mentioning is that there is a tremendous amount of bias in datasets. When aggregating data in a machine learning context, it is difficult to deduce evidence-based conclusions. The methods used to analyze and collect data vary from one provider to another, and every facility wants to use its data to help control its own standard of care. A suggested solution to this issue is to build a business model focused on aggregating data at the patient level. This would allow facilities to recognize the behaviors of individuals (habits, socializations, family complexity, etc.) and collect rich, unbiased data.
Staying in line with the idea that less data is better, what is the least amount of data that could be collected to achieve optimal results? For example, if a pregnant woman is asked if she previously had a premature birth (Yes or No response), based on her answer she could receive more targeted treatment and precautions to reduce complications and medical bills. Rather than attempting to tackle big healthcare data, one could focus on “small data for small outcomes”. This would result in more precise patient data through light-touch interactions which would lead to better facility / patient habits.
_ _ _
I wish I could say that we unraveled answers to some of these complex questions. Instead, the conversation between 30 industry leaders was rich, unfiltered and provocative — in our eyes, a success. Everyone was willing to share critical developments, milestones and roadblocks. Industry giants heard the voices of mighty, lean startups and vice versa. Arguments and compromise ensued, relationships were built and partnerships were seeded.
With some of the brightest minds in medicine, technology, insurance, non-profit, and academia, some leaders around the room had been confronting these issues for decades — others for less than a year. Out of all of those brilliant minds, not one person could pinpoint the solution to one of these healthcare challenges. Personally, I left with hope that big strides are being made in this industry. On behalf of USV, our hope is that we can continue to facilitate open and transparent conversations like this across the country and world.
Everyone knows the big Web 2.0 companies use hundreds of data points to determine which ad we might prefer. And yet — in the deathmatch against disease, we reduce human health to single variables.
Granted, this has partially been due to immature technology and infrastructure; after all, an assembly line of PhDs can only annotate the genome so quickly. There is also a hard limit on a human’s ability to find patterns within the noise.
In the last couple of years however, a few trends have reshaped the landscape for startups working at the intersection of computer science and biology:
1) the hardware layer of the genomics stack has been commoditized,
2) the cost of genomic sequencing has fallen below the threshold required for routine reads,
3) data storage is effectively free, and
4) sophisticated computational tools, including deep learning, have matured, allowing us to apply strategies that were not possible before
Once in a while, there is an inflection point that completely changes the rules of the game. We saw this in the early 2000s, for example, when suddenly you didn’t need a big check to build your own servers and infrastructure, just to get a website up and running.
What this shift enables, is a new generation of biotechnology companies very distinct from its predecessors, with characteristics not unlike the software and machine learning companies we are familiar with.
The characteristics that make software startups so appealing — that you can test your idea cheaply, that you can de-risk early, that you can scale quickly, etc — will be found in this new generation of biology companies also. In fact, many of these startups should really be thought of as machine learning/software companies with domain knowledge in biology. Just as we saw an explosion of web startups running many experiments at a low cost in the mid 2000s, we expect to see a similar phenomenon in the biology space.
And clearly genomics is a big-data problem — arguably the biggest today. The thing is, most people think of the genome as a static tell-all dataset. In reality, even your somatic DNA changes at an astonishing rate; in fact, we can predict your age, within around a 5-year confidence interval, from your genome. That would not be possible if your genome were static. So we need to reframe the genome as a dynamic real-time data stream of what is happening in the body. Then of course, we also need to couple longitudinal genomic datasets with time series biomarker data before we can use our new tools to understand human health a little better.
We have been excited to meet teams that are fully leveraging the promise of this new era. A couple weeks ago, for example, we met cancer diagnostic startup Freenome, which uses cell-free DNA from liquid biopsies to detect cancer at an early stage. If that sounds scary, at a very high level, it is just a machine learning categorization algorithm. What is exciting is that they have essentially taken an agnostic approach to the problem. Healthcare is a notoriously slow-moving industry, but imagine that in the future, new findings will simply be incorporated through a software update.
Beyond disease diagnosis, we have seen startups working in agriculture genomics, drug response, and even designing a new genomic programming language, that have all captivated our imagination.
It will be tempting at times to dismiss these startups as naive; after all, many of them are tackling highly complex problems that generations of scientists have given blood, sweat, and tears to, only to make tiny contributions. And indeed, there are many technical and commercial bottlenecks we have yet to overcome (next post). However, we have seen impressive real-world results, and we are excited about what is to come.
I'm Jennifer, one of the new analysts on the Investment Team at Union Square Ventures!
Before joining USV, I studied statistics and computational biology at Harvard, and then worked at a hedge fund after graduation. I'm particularly interested in new financial/banking paradigms, blockchain + decentralized networks, and genomics.
Growing up, many people around me were entrepreneurs. I was completely fascinated by the idea that you could make your own path, especially in a culture where adherence to the rules was glorified. And whether they were successful or tremendously unsuccessful, they all tried to leverage technology to make the world a better place — and I really admired that.
I'm very excited to get to know all of you — please say hello at @jml_campbell or email me at jennifer AT usv DOT com