Now is the time to fish or cut bait when it comes to Canada’s nascent artificial intelligence (AI) sector. Like other startups, AI firms face several challenges, including access to capital and qualified staff, regulatory hurdles and antiquated procurement rules. But to succeed, AI companies must also find suitable data: the raw material they need to design algorithms.
Access to data, whether from governments, smart cities or private-sector companies, has become an existential issue. While Canadian AI startups hunt for elusive data, tech giants such as Google and Facebook, with free access to massive datasets generated through their own platforms, are taking a decisive lead in deploying new products and algorithms. If left unchecked, this imbalance will suffocate domestic firms on their home turf.
Yet the potential for growth is real. Recent reports from Canada’s six Economic Strategy Tables all recognize the pressing need for sector-specific national data strategies. Whether in natural resources, biosciences, agri-food or advanced manufacturing, digitization is seen as a lifeline for enhancing competitiveness and increasing exports. But the lack of interoperability standards and of a proper data-sharing infrastructure remains a major hurdle.
For example, agri-food businesses are adopting digital technologies that collect large amounts of data that could benefit every actor in the supply chain. However, the data are stored in different formats and on different platforms. This lack of interoperability inhibits the shared open-data platforms that could provide important insights and enable new innovations to emerge.
Data owners and custodians from government and industry need more clarity and predictability in order to share data. Many are sitting on the sidelines because of potential liability downstream.
This is where setting universal standards could help. As explained in “Canada Needs Standards to Support Big Data Analytics,” a policy brief released by the Centre for International Governance Innovation, standards could unleash the potential of Canada’s AI sector by bringing clarity, setting norms and removing liability fears. Without standards, data will remain out of reach and AI investments in Canada will fail.
Unlike wireless services and USB drives, which work seamlessly around the world, data come with no recognized standard for describing and grading them according to agreed-upon attributes, and with little transparency regarding their quality. For example, raw data that have not been processed may have more value than data that have been subject to manipulation.
Data collection can be automated or performed by individuals. The equipment and sensors used to collect data vary significantly in quality, accuracy and reliability. Some organizations apply rigorous testing and maintenance regimes to their equipment; others don’t. All of these variables can be used to grade data, improve transparency and reduce potential liability down the road for data owners and custodians.
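To make the idea of grading concrete, here is a minimal sketch, in Python, of what a dataset descriptor built on attributes like these might look like. Every field and category name below is hypothetical; no recognized standard defines them today.

```python
from dataclasses import dataclass, field
from datetime import date
from enum import Enum
from typing import List, Optional


class ProcessingLevel(Enum):
    """Hypothetical grading axis: how far the data are from their raw form."""
    RAW = "raw"                # as collected, no transformations applied
    CLEANED = "cleaned"        # deduplicated, outliers flagged
    AGGREGATED = "aggregated"  # summarized; original records not recoverable


class CollectionMethod(Enum):
    """Hypothetical grading axis: how the data were gathered."""
    AUTOMATED_SENSOR = "automated_sensor"
    MANUAL_ENTRY = "manual_entry"


@dataclass
class DatasetGrade:
    """Illustrative descriptor for grading a dataset along agreed attributes.
    The fields are invented for this sketch, not drawn from any standard."""
    dataset_id: str
    processing_level: ProcessingLevel
    collection_method: CollectionMethod
    equipment_maintained: bool          # was a testing/maintenance regime applied?
    last_calibration: Optional[date]    # None if unknown or never calibrated
    known_gaps: List[str] = field(default_factory=list)  # disclosed quality caveats


# Example: a raw, sensor-collected dataset with one disclosed gap.
grade = DatasetGrade(
    dataset_id="soil-moisture-2018-q3",
    processing_level=ProcessingLevel.RAW,
    collection_method=CollectionMethod.AUTOMATED_SENSOR,
    equipment_maintained=True,
    last_calibration=date(2018, 6, 1),
    known_gaps=["station 14 offline July 3-9"],
)
print(grade.processing_level.value)  # prints: raw
```

In practice, a standards body would fix the list of grading axes and their permitted values, so that a prospective user could compare datasets from different custodians at a glance.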
Government, industry and other stakeholders can work together to develop sector-specific open standards for data sharing and utilization. Common standards would facilitate sharing and reveal opportunities to lower costs and manage resources more effectively. Protocols can be designed to access and manage data flows from increasingly heterogeneous sources, allowing AI firms to draw on multiple data sources and design better algorithms.
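As an illustration of what such a protocol could enable, the sketch below assumes a hypothetical common record format and shows two custodians, one exporting CSV rows and one exposing JSON, adapted to the same interface. The class and field names are invented for this example.

```python
from abc import ABC, abstractmethod
from typing import Iterator, List


class Record(dict):
    """A schema-neutral record; a real standard would specify a formal,
    versioned message format instead of a plain dictionary."""


class DataSource(ABC):
    """Hypothetical common interface any data custodian could implement,
    regardless of how the data are stored internally."""

    @abstractmethod
    def records(self) -> Iterator[Record]:
        """Yield records in the agreed common format."""


class CsvFieldSensorSource(DataSource):
    """Adapter for a custodian that exports CSV rows (illustrative)."""

    def __init__(self, rows: List[dict]):
        self._rows = rows

    def records(self) -> Iterator[Record]:
        for row in self._rows:
            # Map the custodian's local column names onto the shared schema.
            yield Record(timestamp=row["ts"], moisture=float(row["soil_m"]))


class JsonApiSource(DataSource):
    """Adapter for a custodian that exposes a JSON API (illustrative)."""

    def __init__(self, payload: List[dict]):
        self._payload = payload

    def records(self) -> Iterator[Record]:
        for item in self._payload:
            yield Record(timestamp=item["time"], moisture=item["moisture_pct"] / 100)


# An AI firm can now combine both custodians' data without bespoke glue code.
sources = [
    CsvFieldSensorSource([{"ts": "2018-07-01T06:00", "soil_m": "0.31"}]),
    JsonApiSource([{"time": "2018-07-01T06:00", "moisture_pct": 29}]),
]
combined = [rec for src in sources for rec in src.records()]
print(combined)
```

The value of the standard lies in the shared `DataSource` contract: once custodians map their local formats onto it, every downstream consumer can process every source the same way.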
In addition to interoperability and data standards, Canada also needs a new data-sharing infrastructure that allows data owners and custodians to connect with interested AI companies. Smart contracts are needed to facilitate access to, or the transfer of, datasets. New data pools and trusts should be created to aggregate and expand data stocks.
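A smart contract of this kind would, in essence, encode the terms of a data-sharing agreement as executable checks. The sketch below models that logic in plain Python rather than on any particular blockchain platform; the agreement fields and terms are hypothetical.

```python
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class SharingAgreement:
    """Illustrative stand-in for a machine-enforceable data-sharing
    contract; all fields and terms are invented for this sketch."""
    dataset_id: str
    licensee: str
    expires: date
    permitted_use: str  # e.g. "model training only"


def authorize(agreement: SharingAgreement, requester: str, today: date) -> bool:
    """Grant access only while the agreement is valid for this requester.
    A real smart contract would run equivalent checks on a shared ledger."""
    return requester == agreement.licensee and today <= agreement.expires


deal = SharingAgreement(
    dataset_id="soil-moisture-2018-q3",
    licensee="maple-ai-inc",
    expires=date(2019, 12, 31),
    permitted_use="model training only",
)
print(authorize(deal, "maple-ai-inc", date(2019, 6, 1)))  # True
print(authorize(deal, "someone-else", date(2019, 6, 1)))  # False
```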
As a first step, it would make sense to design sector-specific data-sharing infrastructure that incorporates the particular needs and constraints of the supply chains serving sectors such as health care, energy, advanced manufacturing or agri-food.
In a recent report, research firm Forrester said that the lack of an appropriate information architecture and basic data-governance issues held back the AI sector in 2018. Moving forward, it predicts that firms will work to put “more potent building blocks in place to accelerate their ability to meet AI’s extraordinary promise.” Canada would benefit from taking the lead in developing standards to properly frame data collection and grading, data access and sharing, and data analytics. A new architecture underpinning data value chains is needed if we want Canadian AI developers to focus on the next breakthrough algorithms, rather than hunt for elusive data.
This article originally appeared in The Globe and Mail.