Setting sail to decarbonisation

Shipping touches pervasively but unobtrusively on every aspect of our daily lives: from the clothes we wear, to the food we eat, to the goods we order online. But rarely do we think about the negative environmental ramifications it causes. These include air pollution, water pollution, noise, and oil pollution. Greenhouse gas emissions from shipping currently represent 2.6% of total global emissions, equivalent to those generated by South Korea. The International Maritime Organisation (IMO) estimates that carbon dioxide emissions from shipping could rise by between 50% and 250% by 2050 if no action is taken.

There are several drivers contributing to increased efforts to decarbonise. The maritime industry faces the twin challenge of rising global fuel prices and tighter environmental regulations. In 2018 the IMO announced the objective to “reduce the total annual GHG emissions by at least 50% by 2050 compared to 2008, while, at the same time, pursuing efforts towards phasing them out entirely”. Additionally, the IMO’s global sulphur cap comes into force from 2020, substantially lowering the current 3.5% limit to 0.5% and enforcing cleaner shipping. Consumer expectations, reputational concerns, and pressure from NGOs and investors are also adding to the demand for greener shipping.

Decarbonisation is the key challenge for this industry; however, the design, operation, and maintenance of shipping are built to suit the fossil fuel ‘paradigm’. Deployment of all currently known technologies could make it possible to decarbonise completely by 2035. But how are we going to get there? There is no silver bullet technology that can make the transition easy and effortless. Instead, a wide variety of technologies are needed. Innovation is vital.

Whilst shipping is the least carbon-intensive way to move freight, the industry is highly reliant on outdated technology. This presents a huge energy efficiency opportunity for ships, start-ups and investors. Zero-emission vessels (ZEVs) are needed in order to meet the IMO’s targets and to contribute to the goals of the Paris Agreement. There are three broad solution areas through which decarbonisation can be achieved: technological, operational, and alternative fuels/energy. The largest emission reductions are likely to come from alternative fuels/energy.

Technological solutions involve improving the weight and design of ships, reducing friction, and recovering energy, e.g. via propeller upgrades or heat recovery. Potential fuel savings from air lubrication and hull surface technologies alone could be 2-9%. In this space, for example, graphene is being used innovatively to reduce biofouling, increase the longevity of boat hulls, and decrease friction. Furthermore, many of these solutions are already available on the market and can be retrofitted.

Operational measures involve speed, ship-port interfaces, ship size, and onshore power. Multiple start-ups are specialising in this area: creating digital twins (such as We4Sea) and data-driven cloud platforms, and using AI for predictive analytics to optimise operational performance (such as nauticAi).
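Speed is the clearest illustration of why operational measures matter. As a toy illustration only (not the method of any of the companies above), the sketch below assumes the commonly cited rule of thumb that a ship's propulsion power scales roughly with the cube of its speed, and uses purely hypothetical reference figures, to show how 'slow steaming' trades voyage time for fuel:

```python
# Toy illustration of the speed lever in ship operations.
# Assumption: daily fuel burn scales roughly with the cube of speed
# (a widely quoted rule of thumb); the reference figures are hypothetical.

def voyage_fuel_tonnes(distance_nm: float, speed_kn: float,
                       ref_speed_kn: float = 20.0,
                       ref_consumption_tpd: float = 60.0) -> float:
    """Estimate fuel burned over a voyage sailed at a constant speed.

    ref_consumption_tpd is a hypothetical daily burn (tonnes per day)
    at the reference speed; all figures are illustrative only.
    """
    days_at_sea = distance_nm / (speed_kn * 24.0)
    daily_burn = ref_consumption_tpd * (speed_kn / ref_speed_kn) ** 3
    return days_at_sea * daily_burn

if __name__ == "__main__":
    for speed in (20.0, 18.0, 16.0):
        fuel = voyage_fuel_tonnes(distance_nm=5000, speed_kn=speed)
        print(f"{speed:4.1f} kn -> {fuel:7.1f} t of fuel")
    # Slowing from 20 kn to 16 kn cuts fuel for the same voyage by roughly a
    # third, at the cost of a longer passage - exactly the kind of trade-off
    # that data-driven optimisation tools explore.
```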

Alternative ZEV technologies include ammonia fuel cells, ammonia + internal combustion engine (ICE), biofuel, electric batteries, hybrid hydrogen, hydrogen fuel cells, and hydrogen + ICE. It is crucial to ensure that these fuels do not simply move the GHG problem upstream, as emissions may arise through their production. However, future CO₂ emission reductions from certain alternative fuels could be 100% if they are produced using renewable energy. Not all of the alternative fuels have reached market maturity, and most are still in the research and development phase. There are also issues regarding safety, cost, availability, and sustainability. It is likely, however, that costs will fall significantly in the future. Two of the most likely routes to shipping decarbonisation are the use of biodiesel and the use of ammonia produced from zero-carbon hydrogen. In the near term, novel wing sail systems, such as those developed by Bound4Blue, are already making a splash, with serious potential to reduce fuel consumption.

The maritime industry is not without its risks for start-ups. There are high barriers to entry, and significant levels of skill, experience, knowledge and capital are essential. Information asymmetry, split incentives, and the fragmented nature of the industry are not easy obstacles to overcome. At the same time, climate change is opening up the Arctic, international trade will continue to grow, and demand for shipping will increase. The scale and value of this opportunity must not be ignored or hidden by ‘sea blindness’. This is an exciting time for innovative technology companies, which can play a critical role in decarbonisation.

From Lab to First Adopters

When it comes to finding product-market fit (PMF), entrepreneurial vision is helpful but insufficient. Landing on the moon may be the vision, but it requires precise calculations to actually get there.

Increasing the probability of finding PMF, and accelerating the process, requires the systematic and thorough application of a particular toolset in a stage-specific way. Those pioneering tools are: detailed hypothesis building, market engagement and application discovery, analysis and rapid iteration, and validation.

And, of course, the crowning evidence of PMF for product companies is that first set of deals that proves your ability to generate significant revenues at a high gross margin by solving a high-value challenge, either in a way that no other product can or in a way that is much more effective and efficient. The right set of ‘first deals’ demonstrates market acceptance and pull, and sets in motion a pattern of accelerating revenue capture (traction).

For broader platform companies, the ‘first deal’ challenge involves working with a broader ecosystem to identify applications and build products around your platform that achieve market acceptance. While the goals are the same as with the product company (see above), the difference here is that there are potentially several different applications to which the technology can be applied. The skill lies in choosing the right initial applications that can have a multiplier effect with regard to revenue generation, industry acceptance, and technology scaling.

‘First deals’ are different in nature and require a different, pioneering skillset from those that follow in the growth stage. To generalise, they are harder to win, demand greater intensity, consume more attention, require more face-time with the ‘customer’, take longer, need a broader, more cross-functional consensus within the ‘customer’ organisation, and are substantially more valuable than the deals that follow.

Whereas with ‘known’ products resistance is likely to emerge early, curiosity about ‘the new’ means issues are likely to emerge later. For the venture organisation, where the mis-allocation of resource can be an existential threat, a long but ultimately fruitless engagement is deeply problematic. Curiosity is a powerful lever for stimulating engagement, but it is also a trap sprung by the seductive charms of early interest. The challenge is to convert curiosity into opportunity early by creating a stage gate that gives the counterparty a clear choice between disengagement and a meaningful commitment that signals interest has been transmuted into an opportunity. All too often the issue lies in the lack of leverage that a technology company can bring to bear to ensure adherence to a stage-gated process. It is, of course, the evidenced and transparent promise of the technology that should support a more symmetrical interaction. Once that promise is established, the best way to ensure leverage (most applicable to platform technologies) is to have multiple competing companies from the same industry in the same process at the same time. This creates an urgency to progress and conclude a deal within a desired timeframe, with the carrot (should one be necessary) being some form of preferential access to technology that moves the competitive advantage needle.

At least from the perspective of the technology company, ‘first deals’ are based on no direct precedent. Practice is being formed and enacted for the first time. The execution capability is embryonic. Experience may accelerate the process when wisely applied, but it may also hinder progress by adhering to modes of action applicable to different contexts. Generalised knowledge can be useful but is trumped by context-specific insight. The goal for product companies as they move from ‘technology visionaries’ and ‘early adopters’ (who will adopt largely on the technology’s potential) to ‘followers’ is to evolve a practiced capability built on: fast learning and systematic iteration to distil what works; a creative process mindset; and extraordinary, almost maniacal, attention to ‘customer’ detail.

At each stage, a fit-for-purpose process must be created, tooled up, and optimised. Pooling expertise early into specialist jobs (embryonic functions) is important and is a precursor to scaling. One of the huge advantages of following this type of approach to designing and developing process, whether you are pioneering a product or a platform application, is that it quickly highlights the really critical steps in the process and what is needed to engineer successful outcomes. Those critical steps are nearly always conversations. The end goal is a series of repeatable actions – the smartest and most efficient way of executing deals in these formative stages of the product’s lifecycle.

What is critical about building the execution capability is that it is foundational: it sets down the templates for others to follow. A great house cannot be built upon poorly laid foundations, and starting over is a difficult and expensive job. Bad habits and poorly defined, sub-optimal practices become embedded, and a restart will almost certainly require the recruitment of new people. Success is ultimately measured only by results. There may be many ways to tackle a challenge, but it pays to select the best one.

Energy storage: generation’s forgotten twin

In recent years, the conversation around renewable energy sources has grown broader and louder. Although wind, solar, and their lesser-known cousins do not (yet) account for the majority of energy generation in most countries, they form an increasingly significant part of the grid’s energy mix. Indeed, in 2017 – and for the first time in its history – Britain generated more of its electricity from renewable and nuclear sources than from gas and coal.

Great news. Onwards and upwards! But there’s a small hitch… Renewable energy is famously intermittent. The wind blows when it feels like it and, to the ire of many British beachgoers, the sun shines any time other than when you want it to. Ok, some renewable energy sources such as hydro are more predictable but let’s focus on the intermittent side of things for now.

Because of our historic dependence on ‘predictable’ conventional generation, we often overlook a critical component of energy provision in the next 10… 20… 100 years: storage. All too often, energy generation and storage are unhelpfully divorced from one another. Yet we will never successfully achieve a renewable future without realising an equivalent investment in, and evolution of, energy storage technologies. The yang to generation’s yin, if you will.

It is only really since the advent of Tesla that the world has started to think seriously about energy storage (good Tesla). We have been talking about solar panels and wind turbines for several decades; storage has some catching up to do. Indeed, it is the automotive industry which is really driving the flow of investment into energy storage technologies. This is also why many people are limited to thinking that ‘storage equals batteries’ (bad Tesla), predominantly lithium-ion. Yes, I know we use the same chemistry in the batteries that power our phones, laptops, etc., but this isn’t the coalface of chemical battery innovation.

The potential problem with this battery-focused view is that batteries will not be the best storage technology in every situation. Lithium-ion isn’t even that good: it doesn’t store that much energy, it’s expensive, and it’s not entirely safe. You can pick figurative holes in all batteries, all technology types, but my fundamental point is that this is not a ‘one size fits all’ situation.

For example, a remote monitoring sensor requires short, large bursts of power that might be provided by a supercapacitor. Electric vehicles of the future might run on fuel cells instead of batteries. And grid-scale renewable generation will need to be paired with grid-scale storage, which could take the form of giant flywheels, compressed air energy storage in vast underground caverns, or something we simply haven’t invented yet.
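To make that point concrete, here is a deliberately simplistic sketch – my own illustration, not a design tool – that matches a storage class to a duty cycle using nothing more than discharge duration and energy scale. The thresholds are illustrative assumptions only:

```python
# Deliberately simplistic sketch: picking a storage technology class from the
# duty cycle alone. The thresholds below are illustrative assumptions, not
# engineering guidance.

def suggest_storage(discharge_seconds: float, energy_kwh: float) -> str:
    if discharge_seconds < 10:
        return "supercapacitor (short, high-power bursts)"
    if energy_kwh < 1_000:
        return "battery (e.g. lithium-ion), or a fuel cell for mobile applications"
    if discharge_seconds < 15 * 60:
        return "flywheel or battery for fast grid response"
    return "bulk storage: pumped hydro, compressed air, or something not yet invented"

if __name__ == "__main__":
    print(suggest_storage(discharge_seconds=2, energy_kwh=0.001))       # remote sensor burst
    print(suggest_storage(discharge_seconds=3600, energy_kwh=75))       # electric vehicle
    print(suggest_storage(discharge_seconds=4 * 3600, energy_kwh=2e6))  # grid-scale wind farm
```

The exact numbers matter far less than the shape of the logic: the ‘right’ answer changes with the application, which is precisely why a battery-only view is limiting.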

Successful innovation leaders will remain agnostic as to what future storage solutions will be required by each industry and application. My point is that, by focusing on batteries, we may limit the development potential of other technologies, some of which could be essential to our energy future.

Secondly, and to go back to where I started, we will need a myriad of solutions to support the transition to a cleaner, renewable grid. Unless we re-establish the critical link between storage and generation, innovation in the former will continue to lag behind. In practical terms, that means we will produce all the energy we could possibly want from the sun and the wind, but it will have nowhere to go.

The allure of the ‘data is the new oil’ analogy

The commodities market is no stranger to data; a quick Google search will lead to streams of data showing price fluctuations and percentage deltas. Oil is back up to $70 a barrel and lithium is riding high on the projected growth of batteries and electric vehicles. One thing that is not publicly traded on the commodities market, however, is data itself. A myriad of recent articles have hailed data as the new oil – the most valuable commodity of the last century. However, while the comparison of data and oil has some use, labelling data as a commodity like oil is misleading.

The comparison is an attractive one. Data is seen as the fuel for our modern information economy. It is extracted in a raw and crude form and refined to produce something of real value. Yet, the analogy is overly simple and ignores some key differences. It is important that these distinctions are drawn to enable us to think about data and its value in the right way.

The data/oil/commodity analogy

For those of you who haven’t seen Billy Ray Valentine having the commodities market condescendingly explained to him in Trading Places, it’s probably good to start with a quick definition. Commodities are basic goods and raw materials that are extracted, exchanged and refined: agricultural products, coffee beans, gold, oil and, of course, frozen orange juice. As the alluring narrative goes, data too is mined and refined.

But data lacks what economists call fungibility: the property of a good or commodity whose individual units are essentially interchangeable. If I buy electricity from E.ON or EDF, I still expect both sets of kWh to keep the lights on. Likewise, crude oil is extracted, refined and barrelled for use in power generation, and its value lies in the power generated, which is uniform in its output. That barrel of oil had the same teleological journey as the next one.

Data, on the other hand, is differentiated by type and quality. More importantly, the value of data comes from the insight and information one can extract from its raw form; these insights are highly subjective, largely influenced by the methodology of analysis, and therefore differ wildly with interpretation. Cambridge Analytica had access to a similar ‘barrel’ of data as everyone else. What they did with that barrel, the insights they drew, and their capitalisation of its value set them apart from others.

Another difference in the analogy is that once commodities are used, they often can’t be used again. Data, on the other hand, is not a finite resource. It can be generated, used, reused and reinterpreted. Data can be stored, and its accumulation is highly sought after in the modern information economy. Even when companies go bankrupt and assets get stripped, databases are often considered the most valuable assets. For example, when Caesars Entertainment – a gambling giant that pioneered its “Total Rewards” loyalty program – filed for bankruptcy, its most valuable asset was deemed to be its customer loyalty database, valued at $1 billion. No wonder companies are keen to get you to reply to their GDPR consent emails!

So, as we have explored above, there are real limitations to the data/oil/commodity analogy. But why does it remain so alluring? Its strength lies in the fact that data is a valuable asset that is revolutionising business models and driving technological innovation. The ability to collect data and valorise its raw form into insight and information is the fuel of lucrative new businesses and innovative new models – much like oil was at the turn of the last century.

Data’s use

Of course, when people think about data it is the tech giants of the modern world such as Facebook, Google and Amazon that come up first. Although Facebook was slightly dented by recent events following the Cambridge Analytica revelations, data still reigns supreme. Google’s recent demonstration of their AI Assistant had people simultaneously in awe and shock at the pace of development of natural language processing and artificial intelligence.

It is not just in Silicon Valley and among internet companies that data is revered; industrial giants and deep-tech early-stage companies alike are waking up to the strategic value of data and information. Two of the largest industrial groups, Siemens and GE, are both preparing for the future of industry, where data and the services it can enable will form a key part of corporate strategy. Industrial behemoths like these are increasingly moving towards collecting data and utilising it to improve their ongoing customer relationships and open up new value-added services. This transition will lead to changing business models – a process already under way. Rather than industrial customers buying machinery (products) and maintenance contracts, the likes of Siemens and GE utilise data to provide a continued, long-term service to their customers. Contracts are no longer just about selling products, but about delivering ongoing solutions that rely on data. It is an extension of Rolls-Royce’s “Power by the Hour” concept, developed – well, trademarked in fact – in the 1960s.

Data is spawning innovative technologies, from the obvious smart algorithms to engineered hard technologies: hydro-powered turbines that power smart water networks, novel approaches to asset monitoring, and innovative ways to harvest energy to power the sensors that underpin these. Technologies span from smart approaches to data collection and methods to power sensors through to intelligent methods of analysis. The ability, appetite and vision to adopt these new technologies, and to develop the models that the resultant data/information can enable, will separate winners from losers across different industries. Data isn’t only the fuel of companies like Amazon and Google; it is a lucrative asset that will prove increasingly valuable to industries such as energy, manufacturing and farming (to name just a few).

Conclusion

Data, then, can’t be called a commodity, and it differs from sticky, black crude. It is an asset whose value stems from the interpretation and transformation of data into information. This information is an important component of our modern economy; it will drive strategic diversification in some industries and kill off players who don’t move fast enough with it. Like oil at the turn of the 20th century, data is a valuable asset that is changing the way our economy operates. It is no wonder that the reformist Saudi Prince, Muhammad bin Salman, pledged $45 billion to SoftBank’s Vision Fund, whose focus is on the internet of things, robotics, AI and ride hailing.

Thoughts on Crypto Assets, Initial Coin Offerings, and the Utility Value of Blockchain Technology

New to Bitcoin, blockchain, and cryptocurrencies? Read this primer

A new asset class

I am a believer, or maybe I just want to believe. Is this Amsterdam in the 17th century? My view is no: a new asset class is emerging, and we are about 45 seconds into the evolution of the species.

I read yet another sceptical article on Seeking Alpha this morning, focussed specifically on the Bitcoin (BTC)/Bitcoin Cash (BCH) split. The author’s supposition is that there is evidence of a bubble in Bitcoin because the combined value of the two coins (BTC and BCH) straight after the split did not closely equate to the value of Bitcoin before the split. Discuss.

It is a reasonable argument that someone coming from an equities perspective would (or perhaps should) make. However, there is a large debate to be had around the utility value of the new coin (and the original coin) – this is not a stock split, after all. Then there is a further debate to be had regarding the value of an asset that is perceived to be neither created in significant quantities, nor destroyed or consumed in significant quantities; this is the gold (aka ‘store of wealth’) argument.

As I see it, Bitcoin (as well as other cryptocurrencies) is currently acting as a store of wealth; the bet you place is that in the future it will remain worth something to someone who also wants to store wealth (or to whom Bitcoin has utility value). From the wealth manager’s perspective, I can also see the portfolio diversification argument: to date, cryptocurrencies have not moved in line with any of the major asset classes (unless we make the argument that quantitative easing-related asset value expansion – which appears to have taken place in most asset classes in many major markets – has driven cryptocurrency values upwards).

From a personal perspective, I agree with the portfolio diversification and store-of-value arguments. From a professional perspective, I continue to seek to understand how this emerging technology fits in with businesses, which, for me, is anywhere it has utility value.

A view on utility

Over the last three years the investment community has made the argument that value lies in the underlying blockchain concept as much as, if not more than, in the individual ‘currencies’ – and that blockchain use cases can drive value in cryptocurrencies/assets/tokens (some of the many terms applied; from here on in this article, ‘crypto’) by giving them utility. Off the back of this narrative we have seen a diverse group of businesses emerge for which the word ‘blockchain’ appears, to some extent, to be relevant to their business models. That word alone has led to millions of dollars of capital being raised through initial coin offerings (ICOs), preselling crypto before its utility value can be unlocked – normally because the environment for its application has not yet been created.

The value of these ICOs has become so significant that major regulators have taken an interest in the market. I would argue that the prior lack of interest stemmed not so much from a failure to recognise some of these ICOs as ‘pump-and-dump’ schemes, but from the fact that the values involved were low, with very few (retail) investors involved. Not a place to deploy the limited resources of any national regulator.

The first thing that readers should understand is that, as I currently perceive this technology class, there are two aspects that provide utility to businesses. The first is as a currency, i.e. as a tool for enabling transactions. The second is the crypto token concept, where ‘tokens’ represent either an equity-like play (so get regulated if you want to participate here) or single-use objects that can be applied within a specific ecosystem.

Initial coin offerings

These tokens are interesting: one could use them purely to raise capital for a business and, in fact, with a good governance regime they may make the concept of ‘shares’ significantly less relevant – why seek to operate an international business, yet confine its ownership to those who can access a single regulatory domain (which may not be easy for all those who wish to participate in the business)? Instead, one can buy in at the inception of the business via a token (usually exchanged for Bitcoin), which is subsequently easily transacted on exchanges in major markets such as China and the United States.

Tokens also have a single- (or limited-) use utility model: as a non-equity-type instrument, they enable an entity to buy tokens in bulk at an ICO that will provide utility in markets that do not yet exist, and in doing so to provide the upfront capital that allows that community to come into existence.

A key part of the process for those seeking to raise capital through an ICO is the ‘white paper’, and I see no likely change to this approach soon. Somewhat like a share prospectus, a white paper demonstrates to readers how a team (primarily a technology team) intends to use crypto technology (blockchain) to create a marketplace, often replacing existing markets. The importance of the white paper’s quality cannot be overstated – it is critical to raising capital in an increasingly educated market. Other elements that support a capital raise are also emerging alongside the white paper – particularly a detailed track record of those on the team (some individuals are now into their second or third crypto business) and a quality advisory board (where significant experience of the proposed market’s operation adds considerable weight).

One challenge I foresee for the ICO marketplace is that of credibility – we will have a bust, or a series of busts, because many of the teams who have raised tens, if not hundreds, of millions of dollars will either squander the capital or deploy it incompetently (depending on your perspective) whilst seeking to create markets. Investors will lose confidence on many occasions.

Practical application within growth technology businesses

Coming back to the companies I work with, the concept of blockchain and crypto is less interesting as a way of ‘imagining’ a new business (and running speculative ICOs) than as a way of supporting businesses that already exist. Many companies, although they will not be aware of it today, will need to implement this type of technology in the future, either to retain competitive advantage or to source new funds. Where I believe the companies I work with can leverage significant advantage is where they have an existing business and a proven business model. Given the nature of crypto tokens – they can be created, destroyed, and traded – and the enthusiasm that exists around ICOs today, they represent an extremely interesting way to propel a business forward.

Yesterday I presented one company I work with to a group involved in fundraising for crypto. To say that the response was ‘enthusiastic’ would be an understatement. What they saw was a valid application for blockchain technology (as opposed to paying ‘lip service’ to the concept), along with a significant number of market participants already working within the defined ecosystem – which, to me, is what these blockchain-based technologies best enable, given their role as a medium of exchange. To someone seeking differentiation during fundraising, in a market dominated by the noise of ‘get rich quick’ schemes selling ‘vapourware’, seeing a real business creates significant excitement.

My subsequent call, with another business I am working with, moved on to crypto. Within two or three minutes we were discussing how this type of technology could apply to their business – where two days previously there was an awareness of blockchain, but no detail on how the technology class could support them. Within five minutes we had identified how blockchain technology could be leveraged by the business (again, one with a considerable number of customers and a strong, blockchain-applicable model at its foundation) to provide differentiation and utility.

The Future

I am a strong advocate of blockchain and crypto, and will continue to be so. The ICO market is increasingly hard to ignore – if only because of the vast amounts being raised through these crypto sales. It is certain the crypto market will go through a few booms and busts, but where there is utility value within well-understood marketplaces there is a significant opportunity for businesses. I expect to be working with several companies over the coming months on projects to investigate how this emergent area can bring value to their work.

Air pollution: a public health concern

Cycling home over Waterloo Bridge a couple of weeks ago, I was surprised by the breathlessness and coughing fit that ensued. At first I put it down to my fitness levels, but turning on the news that night my concern grew beyond an initial, trivial, personal worry.

The cause of all this, as I am sure you have all been reading about, was the unprecedented levels of air pollution which set upon the capital earlier this month.


Some claimed it was simply a result of the weather: low wind and high air pressure. While weather does contribute to high air pollution episodes, this isn’t an issue to be dismissed so lightly.

The UK has broken EU air quality regulations every year since 2010.[1] We often complain about China’s levels of pollutants and the smog which engulfs its cities. Well, on several occasions between 17th and 24th January, air quality was worse in the UK capital than in Beijing. It is estimated that air pollution causes almost half a million premature deaths in Europe alone.[2]

Hopefully this aggregation of recent news is enough to convince you that this is a serious issue.

Indeed, more needs to be done on the international stage with stricter enforcement of legislation. Coming down hard on car companies involved in recent emissions scandals is a good start. Governments also need to be held accountable to ensure they stay under legal air pollution limits.

These reactive punishments will hopefully deter such practices in the future. However, to combat air pollution effectively a proactive policy is necessary: a policy at a political and institutional level, but also at a personal one.

In a recent discussion about a circuit-level electricity monitoring technology we are working with, part of its value was neatly summed up by the simple sentence: “you can’t make decisions with your eyes closed.” The same sentence applies here too. A range of air quality sensors and geo-mapping technologies are being utilised to understand where and when pollution is at its worst.
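As a minimal, purely hypothetical sketch of that ‘eyes open’ idea, the snippet below aggregates made-up sensor readings by location and hour to flag where and when pollution peaks. The locations, values and usage of the 40 µg/m³ figure (the EU annual mean limit for NO₂, applied loosely here to hourly readings) are illustrative only:

```python
# Minimal sketch: aggregate hypothetical air quality readings by location and
# hour of day to see where and when pollution peaks. All data are made up.
from collections import defaultdict

readings = [
    # (location, hour of day, NO2 in micrograms per cubic metre) - hypothetical
    ("Waterloo Bridge", 8, 95), ("Waterloo Bridge", 18, 110),
    ("Hyde Park", 8, 38), ("Hyde Park", 18, 45),
    ("Oxford Street", 8, 120), ("Oxford Street", 18, 135),
]

LIMIT = 40  # EU annual mean limit for NO2 (ug/m3), used loosely here as a flag

grouped = defaultdict(list)
for location, hour, no2 in readings:
    grouped[(location, hour)].append(no2)

# Print the worst location/hour combinations first
for (location, hour), values in sorted(grouped.items(), key=lambda kv: -max(kv[1])):
    mean = sum(values) / len(values)
    flag = "OVER" if mean > LIMIT else "ok"
    print(f"{location:15s} {hour:02d}:00  mean NO2 {mean:5.1f}  [{flag}]")
```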

This data alone, though, is not the complete solution; technology will also play a significant role in combating and limiting air pollution in our cities. Chemists and physicists are applying smart technologies to remove toxins from the air. For example, Metal Organic Frameworks, a class of porous nanomaterials, could be used to adsorb certain gases from the atmosphere or to scrub waste gases from industrial processes. These porous nanomaterials could also be used to make alternative fuel sources viable for transport, e.g. natural gas vehicles.

On the topic of transport and vehicles, the proliferation and uptake of battery technology will be significant over the next few years. Cars contribute greatly to the pollutants in the air, and advanced battery technology will make electric vehicles viable for the mass market. Batteries will also be hugely significant in developing sustainable grid infrastructure, unlocking flexibility in consumption and generation assets.

Air quality is a major public health concern. These technologies will play an important role in reducing the amount of pollutants in the atmosphere.

[1] http://www.independent.co.uk/news/uk/home-news/london-sets-modern-pollution-record-air-quality-sadiq-khan-a7550961.html

[2] http://www.bbc.co.uk/news/world-europe-38078488

Water: it’s a precious resource, let’s start treating it as such!

I initially started writing this blog in the summer when it was so hot that I was struggling to sleep and was drinking water like it was on tap (but it is on tap Peter!).

It was late June at the time and I had decided that it was time for my second annual water blog. Last year, I wrote about the impact that drought was having on hydro-electricity production in Brazil and agriculture in California, and how increasing droughts could lead to a greater focus on wind, solar and waste-to-energy technologies, particularly if they could reduce water usage or, in the case of the latter, extract water from waste.

Unfortunately, although I started the blog, I didn’t get to finish it for various reasons, but a realisation in early December motivated me to dust it off (and I promise it had nothing to do with a pressing blog deadline!). Back then, before the January rains came, it seemed to me that we were having quite a dry winter and, not that I love the rain, I didn’t feel this was something to celebrate. I wasn’t sure at that stage whether this was just distorted perception or whether we really were experiencing an unseasonably dry period.

Then I read that 2016 was the hottest year ever recorded, which further motivated me to conclude this blog, especially when Met Office data for December confirmed that rainfall was below normal almost everywhere in England with only 42% of average rainfall overall.

The severity of the crisis

660 million people do not have access to improved drinking water and, while this number is an improvement on previous estimates, it is still a huge one [1]. Another 1.2 billion people are estimated to live in areas of physical water scarcity [2]. The World Economic Forum ranked the water crisis as the risk likely to have the greatest impact on society [3, 4].

It’s everyone’s challenge

This year I want to challenge readers (no matter how few you are) to consider how you too can address this great water challenge that we face. And it is a great challenge, even if its severity and importance appear to be lost amongst the news of melting glaciers, rising seas, floods and storms associated with climate change. Equally, when you live in the UK or Ireland, it’s hard to digest the message that there is a water crisis when it appears to rain so much. But, as the Met Office data for December suggests, we are not immune to shortages.

So, whether it’s recycling water, being more efficient with the water you use, capturing rainwater for domestic/commercial use or using cleaner processes that reduce the treatment required for waste water, make a contribution to the challenge.

Addressing the water challenge

Thankfully, there is a host of technologists and companies seeking to tackle the water challenge and I wanted to share a few of those that have recently caught my eye:

  • NVP Energy has solved the challenge of sustainably treating low-strength wastewater, including at low temperatures, using anaerobic bacteria. Its process reduces COD (chemical oxygen demand) by 80%+ and total suspended solids (TSS) by up to 50%. It generates high-quality biogas as a by-product, which can be used as an onsite energy source, and produces 90% less sludge than alternative treatments.
  • CustoMem is addressing the contamination of water supplies by industrial pollutants. It seeks to treat the 0.04% of micropollutants that are difficult to capture and are also highly toxic, such as heavy metals. Furthermore, the solution not only captures the pollutants but enables them to be recycled.
  • MIT researchers have developed a solar vapor generator, which uses inexpensive materials to clean and desalinate water. The generator consists of a metallic film, a bespoke sponge and bubble wrap as its skin. It heats, boils and evaporates the water, leaving behind unwanted products.
  • Sundrop Farms has sought to address not only the water shortage but also food and energy shortages in the design of its solar-powered seawater desalination plant, which irrigates its tomato crops.

Facing into 2017, all of this has reaffirmed to me how critical the climate change and water scarcity challenges are for humanity. It has further motivated me to contribute to the solution by supporting the technologists seeking to commercialise answers to them, and has reinforced that these are everyone’s battles.

1: http://www.who.int/water_sanitation_health/monitoring/jmp-2015-key-facts/en/

2: http://www.un.org/waterforlifedecade/scarcity.shtml

3: http://water.org/water-crisis/water-sanitation-facts/

4: http://reports.weforum.org/global-risks-2015/#frame/20ad6

 

Space and early-stage companies

No, I’m not talking about Elon Musk’s deep desire to travel to Mars (or Boeing’s apparently equally deep desire to beat him there), although I’m keenly watching to see whether we’re going to see a re-enactment of The Martian in my lifetime.

My interest lies in the terrestrial use of space technology – a particularly pertinent question for RIG, given that one of our clients is a cross-over from the space industry, transitioning from a highly specialised application into a wide variety of terrestrial uses.

When people talk about the cost of space research and how the money could be better spent solving problems on Earth, I often find myself scratching my head – not only is a huge amount of the research applied terrestrially, it’s also quite a significant revenue generator. Space is worth approximately £11.3 billion to the UK economy, with plans to see this value expand to around £40 billion over the next 20 years, yet the sector is currently allocated just £370.5 million. In the US, every dollar spent on space returns about $8 of economic benefit to the economy.

As to solving problems on Earth: everything from CAT scanners to LED lighting to solar energy can trace its beginnings to space-led research. Sworn at your smoke detector recently? You can blame NASA for that one.

More recent use of space technology has been led by both large and small companies – a research project in the UK is currently assessing the use of Sentinel-1 radar data to study crop growth. Space algae is being used to fight malnutrition in Congo, software is being used to drive down costs for offshore oil and gas, and sensors are being developed to monitor the Earth’s atmosphere.

From our own experience, we’ve found that there’s a vast array of potential applications for technology that can be ‘crossed over’ – there’s a lot of excitement in the office around advanced materials at the moment.

Given the return on investment seen and the technology that emerges from it, I find that my real question can’t be ‘why are we spending money on space?’ – it’s ‘why aren’t we spending more?’

Pick your champion

As an early stage technology company, the early deals done with big companies can set the course for the business for some time into the future. Getting them done is rarely simple, but the first step is to make sure you have the right champion.

It is a common misconception that because you are engaged in discussion with someone at a company, you are engaged with the company as a whole. It is rare that a large multinational invests all of its expertise, budgets and problems in a single individual or team; frequently, teams are empowered to solve their own problems in an economic way and encouraged to share their solutions with the group. This means that, generally speaking, there are multiple potential entry points into a company, and it’s wise to take advantage of this.

A second common misconception is that because an individual at a big company fully understands a specific problem or opportunity, they truly care about solving it or grasping it. In actual fact, the vast majority of big companies contain a real mix of cultures and sub-cultures, meaning that even in the most board-mandated, innovation-orientated environments, individuals can on occasion still get decapitated for taking undue risk or diverting themselves excessively from their day jobs to chase the prospect of a new technological Nirvana. Added to this, some people just don’t care as much as they ought to.

Large multinationals are generally constituted of complex and flowing networks of interests, perceptions and political capital. A good champion will help you to understand and navigate these very human elements, as well as support you in mapping the process forward to close. If they’re serious about wanting to get the deal done, then it’s in their interests to be open.

And so, at the risk of sounding like a Sunday supplement, here is a quick ten-point checklist to help you spot whether you have a good one. They:

  1. Understand what it does
  2. Understand the value of what it does
  3. Know to whom internally it has value and the nature of the problem it solves
  4. Can explain it to others
  5. Have real personal credibility
  6. Have a track record of on-boarding external technology
  7. Want to get a deal done
  8. Will empathise with your needs and work with you towards a shared goal
  9. Are frank about timing and will support sticking to an agreed timeframe
  10. Will help you to understand the process of getting a deal done, all of the gates you need to pass through and the boxes to tick

If your primary engagement is with an individual with all of the above characteristics, great – you stand a chance of getting it done. If not, that doesn’t necessarily mean you want to jump off that horse, but it’s well worth considering getting new blood into the stable. Early technology deals should be done as much as possible from the same side of the table, focussed on benefits and structures for win-wins. A good champion works with you towards common objectives.

Getting the right champion is just the beginning; from there on out it’s essential to build your own political capital and to understand first-hand the inner workings of the business. But you’ll never understand it nearly as well as when you have a talented insider with aligned interests.

When should I start commercialisation?

The lean approach to software creation has brought market testing much earlier in the life cycle of a product. Its aim is to find market acceptance as soon as possible, so that companies minimise the risk of building products that turn out not to be sufficiently compelling.

How does this translate for non-software technology products?

If a product is based on new IP, it is likely to have quite a long period before a first trial version is market-ready. So how early should you start your commercialisation?

There is a concern that ramping up commercialisation efforts too far in advance of production readiness could lead to a loss of any momentum that has been built with potential customers and go-to-market partners. There is a temptation to think that it is better to put your head down and focus on getting to a production-ready model.

However, it’s important to remember that engaging the market serves a number of purposes:

  • There are often multiple parties that will be involved in the sales, implementation, operation and maintenance of a technology. Engaging with them is essential to understand what is required for each of them to adopt the technology. This will be central to the go-to-market strategy
  • Working with these parties will give a clearer sense of where the orders for the product will come from in the first 12 to 24 months post-launch. This is a period where sales velocity must be built. Only when they are prepared to shape up distribution or sales deals will it become clear that there is product-market fit. Confirming in advance where the actual orders are likely to come from will help mitigate commercial risk for investors and support valuation
  • Understanding why and how these parties will engage and buy is key to structuring a go-to-market strategy and sales process
  • In the process of verifying the needs of the market, it is quite possible that information will emerge that will result in changes to the product development path.

If market engagement is left too late, this information may not be uncovered, and the cost in terms of lost time and missed targets will be considerable.