Casey Canfield is about to spend a good portion of the next four years working on an artificial intelligence (AI) system for "marginal" missions, and the National Science Foundation has offered her $1.8 million to do so.
Canfield, an assistant professor of engineering management and systems engineering at the Missouri University of Science and Technology in Rolla, is the principal investigator for a team developing an AI platform intended to help transplant surgeons across the U.S. use donated kidneys that until now have been deemed of "marginal" quality, or entirely unsuitable for transplantation.
The number of donor kidneys discarded is not marginal, however – roughly 20% of kidneys donated by deceased organ donors are never transplanted, for reasons such as the donor's age, or because the donor had a disease such as diabetes, was obese, or used street drugs. As a result, people are dying who otherwise might have been saved.
A 2018 study led by researchers at Columbia University laid out the dilemma very plainly: "The widening gap between supply and demand for transplantable kidneys has resulted in less than 35% of patients being transplanted within 5 years of waitlisting, while only 36% of patients on dialysis survive five years or more."
Canfield hopes her research, which is being undertaken with collaborators from the St. Louis University School of Medicine and the United Network for Organ Sharing (UNOS), can reduce that discard rate by about 10% – that is, two of every 20 kidneys now discarded would instead be transplanted, giving their recipients a new lease on life.
The platform Canfield and her colleagues are developing will offer adjunct decision support within what she said is the already well-regulated procedure for kidney distribution, in which patients on the match list are prioritized according to numerous factors, such as the severity of their renal disease. The tool, she said, is intended to help recipients farther down the list than those with top priority.
"You may have a donated kidney that is not in the greatest shape," she said. "Maybe the donor is older, or had comorbidities. The surgeons at the top of the list probably won't want that kidney, but there might be someone further down the list where the benefit/cost ratio is kind of flipped, and they might have to wait a long time for a kidney offer. Their patients could really benefit from this kidney, and that's what we are trying to help with.
"The value of this system will be in some of these marginal cases. Transplant surgeons vary with their risk thresholds and what they are comfortable with. In a situation where a surgeon may be on the fence and the AI says it looks like a good offer, they may go for it when they may not have otherwise."
Canfield said her team is fortunate in that much of the transplant ecosystem's data has already been standardized and normalized, and that much of her work, which she characterizes as "transdisciplinary," will be centered on reconciling the needs of different domain users and ensuring those who use the system trust it.
However, healthcare is not the only economic sector in which AI researchers are developing new tools around the margins. The food industry, for example, has undertaken an effort to make its distribution system more efficient in the wake of the COVID-19 pandemic and the highly visible increase in demand for donated food. Unlike healthcare, though, the food industry is practically starting from scratch in building common classifications at even the most granular level.
How do you label 'tomato'?
As the pandemic swept across the world in 2020, virtually shutting down sectors of the economy, food banks were swamped with demand, and plenty of food was available that would otherwise have gone to waste. Yet the organizations positioned to get that food to the people who needed it relied on antiquated manual processes and on institutional knowledge lodged in employees' and volunteers' brains rather than in data systems. In addition, much of the data that was in systems was siloed, inaccessible to partner organizations that could supply food, request it, or transport it.
There was no shortage of efforts across the U.S. to implement Big Data solutions to the problems of greatly increased demand for food donations, logistical restrictions precipitated by quarantine and distancing mandates, and financial pressures. Numerous technology-sector companies, from behemoths such as Google, with its Food For Good program, to consultancies such as McKinsey, to startups such as Austin, TX-based SmarterSorting, have partnered with food-distribution organizations such as Feeding America, with local and regional food banks, and with grocery chains; Food For Good notably partnered with Cincinnati-based Kroger, the nation's largest supermarket chain by sales revenue.
Linking the granular food data across these entities together was not easy, according to the Food For Good team that put the network into place.
"After building data pipelines to Feeding America and Kroger, the X team's first task was to confront disparities in food descriptors head on," two of the team's members, Joe Intrakamhang and Mike Ryckman, wrote (Google did not respond to several requests for further comment). "How does one name a tomato, describe it, quantify it, and locate it? How do we represent a clamshell container of tomatoes consistently across all datasets from all parties?"
Even within one organization, they wrote, there were disparities in representing the same thing. Feeding America is a nationwide network of 200 independent food banks, each with its own origin story, practices, and non-corresponding IT systems. "As an example, even something as simple as the name of the state of Texas was logged in 27 different ways!" they pointed out. "This was common throughout the data: for storage facilities, for example, one food bank may refer to their refrigerators as REFR, while another might use REFER."
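The reconciliation work Intrakamhang and Ryckman describe is, at its core, a matter of mapping each party's local vocabulary onto one canonical code before datasets are joined. A minimal sketch of that idea (the mappings and record fields here are illustrative, not Food For Good's actual schema):

```python
# Illustrative label reconciliation: map each food bank's local codes onto
# shared canonical values before joining datasets. Mappings are invented.
CANONICAL_STORAGE = {
    "REFR": "REFRIGERATED",
    "REFER": "REFRIGERATED",
    "FRZ": "FROZEN",
    "FRZN": "FROZEN",
}

STATE_ALIASES = {
    "TX": "TX", "Tex.": "TX", "Texas": "TX", "TEXAS": "TX",
    # ...one entry per variant; the team found 27 spellings of Texas alone
}

def normalize_record(record: dict) -> dict:
    """Return a copy of an inventory record with shared, canonical codes."""
    out = dict(record)
    out["storage"] = CANONICAL_STORAGE.get(
        record["storage"].strip().upper(), "UNKNOWN")
    out["state"] = STATE_ALIASES.get(record["state"].strip(), "UNKNOWN")
    return out

print(normalize_record({"item": "tomatoes, clamshell",
                        "storage": "refer", "state": "Texas"}))
# {'item': 'tomatoes, clamshell', 'storage': 'REFRIGERATED', 'state': 'TX'}
```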
Even connecting these entities in a logistical confederation stops short of where the food system should be, according to Matt Lange, CEO of IC-FOODS (the International Center for Food Ontology Operability Data and Semantics), a not-for-profit University of California, Davis spinoff developing a semantic platform for an Internet of Food analogous to the Internet of Things.
"We should be thinking not just 'The grocery store has this carton of lettuce we should be getting up to the food bank,' but we should be thinking about an actual Internet of food where you have something that can read and understand food the same way a Web browser can read and understand the Internet," Lange said. "The browser is built on common language and that is what we are talking about, making a common language for food. There are lots of researchers who will say they don't need an ontology to build AI. You don't, but if you want to be able to share your models and the underlying data – if you want a lingua franca, a common language, you want the ontology."
Ideally, he said, an Internet of Food can not just supply common, universal, machine-readable (and learnable) labels for items such as tomatoes or package sizes, but also dig deeper into the components of discrete foods, such as how much water is in exported alfalfa, or whether a nation is a net exporter or importer of specific nutrients by virtue of the foods it buys and sells.
"We can do this with food composition databases linked to food flows through the transportation system," Lange said. "Now we are in a place where we have a lot of the components, but we haven't built the systems to put them to use for the food system. There is still a lot that needs to be built to bring these pieces together to make sure people across the supply chain are using the same terminologies. And it is a much longer supply chain than that of the medical-industrial complex. And much more diverse."
The more granular the information in that chain, the more can be gleaned from it for setting policy. One area where such data could be useful is the adoption of climate-conserving agricultural practices: IC-FOODS is participating in a $3-billion project funded by the U.S. Department of Agriculture (USDA), and also in ICICLE.AI, an Ohio State University-led partnership investigating AI applications for creating smart foodsheds, digital agriculture, and animal ecology. Lange said another area where an Internet of Food could introduce much greater efficiency into the policy process would be tracking how public funds are spent through the Supplemental Nutrition Assistance Program (SNAP).
"You would think the U.S.D.A. (U.S. Department of Agriculture), which administers SNAP, would get back the data of what people purchase with SNAP funds," he said. "It doesn't happen. How much money does the federal government hand out – and grocery stores and food companies are making money hand over fist on this – and they don't have a report on what was actually bought and consumed? It's like walking away without the receipt and thinking, 'I guess that was right'."
How to assure sustainability?
How – or whether – new tools such as Canfield's and a food system ontology prove sustainable may be one of the most important questions surrounding technologies put to use in such societal "edge" cases. Canfield is confident any tool that emerges from her research could find a home in numerous use cases; the transplant process is highly regulated, and data formats, she said, are already fairly well aggregated and normalized at the national level. In addition to pursuing further NSF funding focused on implementing the platform, she said it could be licensed to UNOS in a sort of umbrella architecture, or to individual transplant centers.
Implementing a universally intelligent platform for food from production to consumption and disposal – with sustainable benefits for every entity in that network – will be much more difficult, Lange said, even beyond the purely technical aspects. He contends that even the heightened awareness of food banks and the need to adequately supply them around year-end holidays does not provide adequate momentum to sustain a concerted effort around a universal standard for food data.
"It's tough, because we know the Marines are going to be out there with their toy drives, and we know about the holiday season food drives – but typically they are once a year, they are not a real systemic solution. There have to be economic incentives for lots of companies, and there has to be an alignment of what they are doing together with that of doing social and environmental good. What is the path for doing that?
"This is why IC-FOODS exists. It should be made in an open forum, where everybody understands the stakeholders' roles."
Gregory Goth is an Oakville, CT-based writer who specializes in science and technology.