Birthing the Center for Population Health Sciences

Baldeep Singh, MD, with staff at Samaritan House

Mark Cullen, MD

We are told to beware of moving parts, and those of us who value our digits and appendages wisely stay out of the way. In the new Center for Population Health Sciences, there are an infinite number of moving parts; standing there in the middle of them is Mark R. Cullen, MD, its director (professor, General Medical Disciplines), bringing order to this new venture.

All new academic endeavors have some similar needs: space, funds, staff, interest. The center has some of these, especially a lot of the latter. “There is incredible interest,” says Cullen enthusiastically. “We are enrolling people who are interested in being affiliate members. We already have 350 from the School of Medicine, and we anticipate another few hundred from across the campus.”

Together with Deputy Director Lorene Nelson, PhD (associate professor, Health Policy and Research), Cullen is creating “a place where health and other forms of data derived from large populations will be made accessible to Stanford faculty and staff supported by our curation services to assist investigators in finding collaborators and analytic support.”

Working Groups
The fundamental unit of these collaborations is the working group, which Cullen describes this way: “What I imagine is that each working group will attract 10 to 12 people who are really interested in a particular project and another 10 or 20 who will be bystanders, watching everything on the intranet we are building to facilitate the work before they get engaged.

“We’ve got 10 working groups that we’re about to spawn,” Cullen continues. “Each targets a problem area or phase of the life-course where there are myriad unanswered questions about the origins of health and disease. Some examples are ‘Sex Differences in Health’ or ‘Retirement, Disability, Cognitive Decline and Aging’ or ‘Immigration and Health.’ It’s hard to know how fast these and the others will gel, but I’ll be disappointed if some don’t begin to gain traction by the end of 2015.

“For every group, I’m trying to group faculty on the main campus with counterparts from the School of Medicine so that there are at least two very distinctive perspectives about what’s important, and different research approaches.”


Raw Materials
Some working groups have great ideas but limited access to data or populations ideal for study. Cullen has an answer: “We’ve already bought a big commercial claims set; we are negotiating with the Centers for Medicare and Medicaid Services to buy the Medicare set; and there are literally dozens of fabulous data sets around campus, including the Federal Research Data Center, that need only new coordination to become a researcher’s dream.”

Some more ambitious projects with groups both local and global are also underway. For example, Cullen points to the INDEPTH dataset, about which “we are actually sending a group to meet in Addis.” INDEPTH has surveillance and demographic data on 52 discrete, large populations (10 to 300,000 people each) in 52 Southeast Asian and African countries. He continues: “A core agreement to facilitate exchange with the Danish Registries and Biobank has been executed and three pilot projects have been launched; we are having ongoing discussions with Santa Clara County to develop a health information exchange that will link electronic medical records on almost all county residents irrespective of which health care they use, and further link these to population-level data at the County Health Department. Recently we received expressions of interest from both Singapore and Taiwan about collaborating with their national health authorities, gaining access to additional data troves.”

Cullen cautions that “some of these projects will take several years to mature, but that’s the whole point. We want them to mature under the watchful guidance of the working groups so that people can mold what might come from them.”

Cullen also has plans to support the working groups in novel ways. For instance, “When our intranet is up and going, we will start a resource exchange where people can post projects, ideas, opportunities for postdocs, requirements for a research assistant, etc. A student seeking a particular type of research experience could post that, hoping a faculty member might say, ‘oh great, a student with nothing to do; just what I need for the new study….’”

As grants are funded and donations received, Cullen will achieve another goal: “Someday I’d like to say to the leaders of the working groups, ‘here is $5,000 or $10,000 to help you grow; here’s a full-time staffer to help you write grants; here are two postdoc stipends; here’s a stipend for a visiting scholar to come work with your group.’”

Space
For most academic centers, space is right up there with money as the biggest concern. So too for Cullen. “A lot of working group faculty have no proximity to each other. What would be truly fantastic would be if we had a building, where people in working groups could use a chunk of space; where, for example, every Friday morning the working group on ‘The First 1000 Days of Life’ could meet. There would be hotelling space, good coffee, and quiet group work areas.”

Staffing
The center will not have a huge staff. Cullen explains: “I imagine we will eventually have 10 or 15 faculty who get some support from the center and a professional staff of another 20 people. We are shortly merging with the Office for Community Health, which already has a staff of 10. It will be the feet-on-the-ground link with the community health centers nearby, plus it will drive some of the education around population health.”

Funding
The center received its initial operating budget from the Stanford Center for Clinical and Translational Research, Stanford Health Care, and the dean of the School of Medicine, along with a future allocation of space and resources to attract new and promising faculty. The challenge now is to develop a revenue stream from grants and, through philanthropy, to raise the resources needed to make the center a sustainable fixture.

“We are trying to write some grants which themselves could generate immediate payback in terms of resources,” says Cullen. “For example, we are responding to a request for applications from the National Institute for Minority Health and Health Disparities to develop a center focused on using tools of precision health to address health disparities. If we’re successful, that would produce substantial resources to jump-start several working groups, including one on ‘Health Disparities’ and another on ‘Gene-Environment Interactions,’ as well as the Office for Community Health.”

It is obvious this is a work in progress, with many moving parts and uncertainties. But the director of this center has dreams and enthusiasm and plans to make it all come true. “It’s exciting precisely because it’s not all pat and set in stone,” he says. “There’s so much opportunity for innovation, for experimentation, and for leadership and members alike to shape and mold those future dreams.”


An App to Improve Heart Health

Alan Yeung, MD, and Michael McConnell, MD

In March, Stanford cardiologists launched MyHeart Counts—a new mobile app that enables users to contribute to a large-scale study of heart health while learning about their own cardiovascular risk.

The public reception was overwhelming. To date, over 41,000 users have signed up for the free app and consented to participate in the study, and the number continues to climb. Apps may be a relatively new frontier of medicine, but they have the potential to reach large populations that traditional medical studies can’t. “There have been larger research studies, particularly national efforts to study their populations, but we believe enrolling this many participants in such a short time frame is unprecedented,” Michael McConnell, MD (professor, Cardiovascular Medicine), told Stanford Medicine earlier this year.

The goal of MyHeart Counts, McConnell said, is “to be the largest study of measured physical activity and cardiovascular health to date.” He continued, “We want people to join in this research effort to give them personalized information about their heart health and help provide fundamental new insights into how activity helps your heart, across all ages, genders, cultures, and countries.”

MyHeart Counts is one of the first five apps to use Apple’s ResearchKit, an open source software framework specifically designed for medical and health research.

The app relies on questionnaires, surveys, and the iPhone’s built-in motion sensors to collect data on cardiac risk factors, lifestyle behaviors, and physical activity. After an initial survey of basic health information, including age, weight, sleep patterns, and daily exercise routines, users participate in a seven-day assessment of physical activity and complete a six-minute walk. Participants are then asked to check in with the app every three months.
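The participant timeline described above can be sketched as a simple schedule builder. This is an illustrative sketch only, not the app’s actual code: the function name, event labels, and the 91-day approximation of a three-month quarter are all assumptions made for the example.

```python
from datetime import date, timedelta

def build_study_schedule(enrollment: date, n_checkins: int = 4):
    """Sketch of the MyHeart Counts participant timeline: an initial
    survey, a seven-day physical-activity assessment ending with a
    six-minute walk, then check-ins every three months."""
    schedule = [("initial_survey", enrollment)]
    # Seven consecutive days of motion-sensor activity monitoring
    for day in range(7):
        schedule.append(("activity_monitoring", enrollment + timedelta(days=day)))
    # Six-minute walk on the final day of the assessment week
    schedule.append(("six_minute_walk", enrollment + timedelta(days=6)))
    # Quarterly check-ins thereafter (a "quarter" approximated as 91 days)
    for quarter in range(1, n_checkins + 1):
        schedule.append(("checkin", enrollment + timedelta(days=91 * quarter)))
    return schedule

# Example: a participant who enrolled when the app launched in March 2015
schedule = build_study_schedule(date(2015, 3, 9))
```

The same shape of schedule, shifted to each user’s own enrollment date, is what lets the study collect comparable longitudinal data across tens of thousands of participants.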

Once collected, users’ data is used for research. As McConnell explained: “There are two major elements to the study. One is collecting data as broadly as possible on physical activity, fitness, and cardiovascular risk factors, which provides important feedback to the participants and helpful research data for our study. The second is studying ways to help people enhance activity and fitness, and decrease their chances of heart disease.”

Open source “Apple Research Kit” and apps like “My Heart Counts” could have big effect on patient-centered research!

— Josh Knowles (@joshuawknowles) March 10, 2015

Five months after its debut, researchers launched MyHeart Counts in Hong Kong and the United Kingdom. At the same time, they released a new version of the app that focuses on providing participants with more feedback about their individual behaviors and risk, and compares an individual user’s fitness data to other participants.

“We are very excited to be able to take MyHeart Counts global,” said Euan Ashley, MD (professor, Cardiovascular Medicine), a co-investigator for the MyHeart Counts study. “Cardiovascular disease is the number one killer worldwide, and we have an unprecedented opportunity to study risk factors such as physical activity, fitness, and sleep in countries around the world.”

Endoscopic Submucosal Dissection

Shai Friedland, MD

Maybe it’s the sushi, or maybe it’s the Korean barbecue, but for some reason stomach cancer is more prevalent in Asia than in the United States. That’s why 10 years ago doctors in Japan developed a minimally invasive technique called endoscopic submucosal dissection to overcome the technical limitations of removing early gastric (stomach) cancer with other endoscopic tools.

About two years ago Shai Friedland, MD (associate professor, Gastroenterology and Hepatology), began performing the procedure at Stanford. That was after Friedland met several Japanese and Korean pioneers of the technique, observed them perform the procedure in Korea, attended courses they had taught in the United States, and practiced the technique under their careful supervision.

To date, Friedland has performed about 50 cases, and he’s currently collaborating with Dong-Hoon Yang, MD, a clinical associate professor at Asan Medical Center in Seoul, Korea, on a manuscript about a simplified endoscopic submucosal dissection technique in the colon. The two doctors are comparing the success of the technique at the two institutions, and they expect the paper to show that the technique is successful in both countries.

Because relatively few patients in the US have the stomach lesions that would merit the procedure, only a couple of doctors in this country have had an opportunity to perform endoscopic submucosal dissection, a procedure that usually takes one to two hours.

“The procedure is very challenging technically to perform, and it is relatively risky, especially for a doctor who is not very experienced in the technique,” says Friedland.

However, the procedure has many advantages over standard treatment methods.

“The endoscopic technique that this replaces is known as EMR – endoscopic mucosal resection,” Friedland points out. “That’s a technique where you also inject fluid underneath the lesion, but you use a snare, which is like a lasso with an electric cautery, to remove one piece at a time until the whole lesion is removed. That technique is suitable for very small lesions or when you don’t care about removing the lesion all in one piece. We use that technique with a lot of colon polyps because they’re more benign than these stomach cancers, and it seems to work pretty well in those instances. But for earlier stomach cancer, EMR is really inferior to endoscopic submucosal dissection. In those cases it’s important to remove the lesion in one piece, and those lesions are often fairly large—much larger than a snare can get.”

Often when there are larger lesions in the stomach, the recommended treatment is a total gastrectomy, which is open surgery to remove the entire stomach and connect the esophagus directly to the intestine.

“While a total gastrectomy is not overly complex and takes only a few hours, it is generally very difficult for patients to live well and eat well after that kind of surgery. They’ve lost their entire stomach, which means they then can no longer eat large meals, they can’t enjoy their food as much as they did before, and they lose a lot of weight,” Friedland says.

Before development of endoscopic submucosal dissection it was only possible to remove relatively small lesions in one piece, which the Japanese found to be sub-optimal for early gastric cancers, according to the Stanford professor.

“Because we’re just removing the mucosa—the inner lining of the stomach—the wound heals on its own in a few weeks, and the patient is basically left with a stomach that works as well as it did before. So that’s really the great advantage of these minimally invasive treatments,” Friedland says.

Endoscopic submucosal dissection is ideally suited for selected patients with pre-cancerous conditions or early cancer in their stomach, esophagus, colon, or rectum.

The Search for Uremic Toxins

Tammy Sirich, MD, and Timothy Meyer, MD

Kidney dialysis has not changed much since it was first introduced to a broad public in the 1960s as a miraculous life-saving system to clean the blood when natural kidney function fails. Although dialysis continues to save lives, it does only about 10 percent of what a functioning kidney can do to remove toxic wastes (called “uremic toxins”) from the blood stream. A patient on dialysis faces an exhausting and time-consuming process multiple times a week, with the prospect of serious health problems, from heart and bone disease to anemia, and a significantly shortened life expectancy once dialysis begins.

The problem is that after dialysis, patients continue to suffer from a previously non-existent, life-threatening condition that has been called “residual syndrome.” Scientists now believe this syndrome is probably caused by as-yet unidentified toxic molecules that remain in the blood stream because dialysis does not remove them. Dialysis is known to remove urea from the patient’s blood, alleviating some symptoms of kidney failure.

What if dialysis could identify and target the remaining toxins, among the hundreds of “waste” molecules left in the bloodstream after urea has been removed, so patients could live longer, healthier lives after treatment? That has been the focus of a decade of investigations by Timothy Meyer, MD (professor, Nephrology), and new studies with his colleague, Tammy Sirich, MD (instructor, Nephrology). Both specialize in kidney research, along with the care and treatment of patients with kidney disease at Stanford and its affiliate Palo Alto Veterans Affairs (VA) Hospital.

Meyer and Sirich are determined to change the way that dialysis works.

“We think that dialysis patients still feel sick because many different substances could be removed by dialysis—but we have not yet identified which of those left in the bloodstream after treatment are the harmful ones,” explains Meyer. “It is shocking that with all the technology at our disposal, we have not yet been able to identify exactly which chemicals are the ones that cause illness when the kidneys fail.”

“This is a chemistry problem with a solution that can change the face of dialysis,” Meyer says. “We were both chemistry majors in college, which predisposes you to go into nephrology to study the waste chemicals that the kidney cleans from the body.” Their background in chemistry has led them to the current investigations.

It is Meyer and Sirich’s goal to find new ways to establish the chemical identity of the specific molecules that make patients sick. Scientists have characterized over 200 molecules (called “solutes”) that appear in high concentrations in the blood after kidney failure occurs, and there could be thousands more. It is known that certain classes of solutes are removed less well by dialysis than urea, including protein-bound solutes, relatively large ones, sequestered compounds, and substances removed by the normal kidney at rates higher than urea. But until recently, there were no good analytical tools to determine which of these were the uremic toxins that cause patient symptoms and illness.

Kidney dialysis has not changed much since it was first introduced to a broad public in the 1960s as a miraculous life-saving system to clean the blood when natural kidney function fails. Although dialysis continues to save lives, it does only about 10 percent of what a functioning kidney can do to remove toxic wastes (called “uremic toxins”) from the bloodstream. A patient on dialysis faces an exhausting and time-consuming process multiple times a week, with the prospect of serious health problems, from heart and bone disease to anemia, and a significantly shortened life expectancy once dialysis begins.

The problem is that after dialysis, patients continue to suffer from a life-threatening condition, unknown before the era of dialysis, that has been called “residual syndrome.” Scientists now believe that this syndrome is probably caused by as-yet unidentified toxic molecules that dialysis leaves behind in the bloodstream. Dialysis is known to remove urea from the patient’s blood and thereby alleviate some symptoms of kidney failure.

Meyer’s research has focused on elucidating the cellular and pathophysiologic mechanisms responsible for the progression of kidney disease. His work includes studies of which molecules are toxic, how these are produced by the body, and how their production could be decreased or their removal could be increased.

Meyer and Sirich now want to identify the toxic substances causing harm in the bloodstream—to provide a more rational basis for prescribing dialysis to patients before they become seriously ill. Ultimately it could lead to improved treatment of patients with kidney failure.

The mass spectrometer is what first brought Meyer and Sirich together in their search for uremic toxins. Sometimes called the smallest scale in the world, the mass spectrometer is an analytical chemistry instrument with software and detection tools that can measure the mass and abundance of atoms and molecules, allowing it to identify the specific chemicals in a sample.

“We were both interested in identifying uremic toxins, and we were both interested in using mass spectrometry to characterize the toxic solutes in the blood that were poisoning our patients,” Sirich recalls.

Meyer had just acquired a mass spectrometer for his research lab when Sirich joined his team as a research fellow and chose “the search for uremic toxins” as her research focus. Together, they began to unravel the candidates for “most toxic solute” among the waste chemicals they found in samples from patients on dialysis, as compared with the compounds found in patients with healthy kidney function. They learned what mass spectrometry could do to identify the mass and abundance of the compounds they found. With a grant from the National Institutes of Health (NIH) in 2008, they began to study patient samples in a large dialysis cohort, and they have since received additional funding from the NIH and the Department of Veterans Affairs to continue their work in the field.

“We use mass spectrometry to examine the biochemical garbage that is left after dialysis is done, and our goal is to sort out which streams of garbage—which solutes left in the bloodstream after dialysis—are causing so many symptoms for patients,” Meyer says. They use sophisticated metabolic studies to identify and characterize small molecules in the blood, and then establish which ones appear in the highest concentrations in patients with kidney failure and disease symptoms.
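
Sorting those “streams of garbage” amounts to comparing measured solute abundances between patient and control samples and ranking the candidates. A minimal sketch of that comparison, with hypothetical solute names and values (not data from these studies):

```python
# Sketch: given solute abundances (e.g. from mass spectrometry) measured in
# patient and control plasma, rank candidate uremic toxins by how elevated
# they are in patients. All names and values here are invented.

patients = {"solute_A": 8.0, "solute_B": 1.1, "indoxyl sulfate": 9.5}
controls = {"solute_A": 1.0, "solute_B": 1.0, "indoxyl sulfate": 0.5}

def fold_changes(case, ref):
    """Patient/control abundance ratio for every solute seen in both groups."""
    return {s: case[s] / ref[s] for s in case if s in ref}

ranked = sorted(fold_changes(patients, controls).items(),
                key=lambda kv: kv[1], reverse=True)
for solute, fc in ranked:
    print(f"{solute:16s} {fc:5.1f}x elevated in patients")
```

Real metabolomic workflows add statistics and identification steps on top of this ranking, but the core question is the same: which solutes stand out in patients with kidney failure.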

Through their mass spectrometry studies of patient samples, Meyer and Sirich have characterized new solutes in the blood of patients after kidney failure. The investigators have also employed untargeted mass spectrometry to identify solutes that are protein-bound and most efficiently cleared by the kidney, and they believe that further analysis of those could point the way to the identification of other harmful substances.

Their studies to date have focused largely on two specific protein-bound molecules that may turn out to be uremic toxins in dialysis patients. Indoxyl sulfate and p-cresyl sulfate may contribute to cardiovascular disease in kidney failure; and indoxyl sulfate may also contribute to progression of kidney disease. These are among the large number of waste substances produced by colon microbes; and because they are made by microbes in an isolated compartment, they may prove simpler to suppress than other kidney waste.

A clinical trial that derives from this work, Dietary Maneuvers to Reduce Production of Colon-Derived Uremic Solutes, directed by Meyer, is now recruiting patients to evaluate whether dietary fiber supplements can reduce production of chemicals produced by colon bacteria that build up in the body in patients on dialysis.

Further studies and expanded clinical trials are the next steps in the search for uremic toxins. Although it is now possible to reduce the levels of some solutes by modifying the dialysis procedure or by limiting production, clinical trials must determine if these changes will clinically benefit dialysis patients.

“This is like searching for a needle in a haystack, to solve a major clinical problem in the field of kidney disease,” explains Sirich. “But our studies could impact all the patients that we see every day at Stanford and the VA, and more than 350,000 kidney patients who are on dialysis in the US and beyond.”

Defusing Leukemia Killer Cells: From Basic Stem Cell Research to Therapeutic Human Drug Development

Ravi Majeti, MD, PhD

Physician-scientists at Stanford are nurturing the equivalent of a biotech start-up, right in the halls of the university, to develop a new therapy for acute myeloid leukemia (AML), a rapidly fatal cancer of the blood and bone marrow. It is “unprecedented,” explains Ravi Majeti, MD, PhD (associate professor, Hematology), who is part of a team of researchers that used advanced techniques to identify and test an antibody that could lead the immune system to “eat” the cancer-forming cells in leukemia and solid tumors. Starting with discoveries in the lab, the researchers found that one particular protein marker on the surface of leukemia stem cells – CD47 – is a culprit in AML. That has led to a new experimental approach to treat AML without chemotherapy or bone marrow transplantation, with an antibody therapy that interferes with the actions of CD47 and defuses the killer cancer cells. Now the CD47 Disease Team has developed and manufactured a clinical-grade therapeutic targeting CD47 on cancer stem cells, which could eliminate those cancer-forming cells in AML while preserving normal, healthy stem cells.

“AML is the cancer with the strongest evidence for the critical involvement of cancer stem cells,” says Majeti, “and it is the most common acute leukemia affecting adults. Targeted antibody treatment offers the possibility of improved clinical outcomes for AML.” Currently, most patients with AML will die within the first year of diagnosis, even with aggressive chemotherapy and bone marrow transplantation; and five-year overall AML survival rates are as low as 30 percent.

“We are really hoping to make a difference in this disease,” Majeti continues. “We have embarked on a massive undertaking inside the university to make an impact on the outcomes of patients with AML. We have manufactured a drug that targets the CD47 molecule, and we have an open, active clinical trial at Stanford. We did basically what a start-up biotech company would do, but we got some unique grant funding to allow us to do it all inside the university. It’s been a new experience both for the university and for us as researchers.” Clinical trials at Stanford have begun in patients with solid tumors, to measure tumor response and safety, and trials with AML patients will begin towards the end of 2015.

Their work is part of a collaborative effort in hematology that crosses the campus and also includes researchers at the pioneering AML Working Group in the UK. “We established the CD47 Disease Team as an integrated program, with a highly collaborative group of scientists and clinicians working to bring CD47-targeted therapies to patients,” says Majeti. The team includes important clinical and research partners in the US and the UK – from investigators who see patients in clinical trials, to leading drug development experts. The CD47 Disease Team was awarded funding for this groundbreaking translational research from the California Institute for Regenerative Medicine.

The targeted antibody treatment for AML is an outgrowth of research by Majeti with Irving Weissman, MD (professor, Pathology), a leading investigator in the biology and translational applications of normal blood stem cells and cancer stem cells. In the 1990s, Weissman helped develop the then-controversial theory that there are specific cancer stem cells within the whole tumor or cancer mass, and that those cells drive the growth of the cancer.

He hypothesized that molecules only found on the cancer stem cells could be targeted by antibody therapeutics to eliminate cancer and cure patients. In parallel, Weissman identified a method to remove and purify the patient’s own blood-forming stem cells and return them to the body, purged of cancer, so patients could rebuild their blood supply from scratch.

Further research by Majeti, along with Weissman and a team of researchers in 2012, set the stage for a new understanding of how AML develops. Their research proved that leukemia develops from mutations that occur in blood stem cells and also determined the order in which such mutations occur. Weissman directs Stanford’s Institute for Stem Cell Biology and Regenerative Medicine and the Ludwig Center for Cancer Stem Cell Research and Medicine. Majeti, who is also a member of the Stanford Cancer Institute and the Institute for Stem Cell Biology and Regenerative Medicine, pointed out that having the correct model of how leukemias arise is important because it may eventually help determine what kind of therapy might be most effective.

This has led to the establishment of the Translational Program in Hematologic Malignancies that brings together interested researchers from across the campus to better understand these malignancies and make progress towards treatment. The team incorporates clinical investigators enrolling patients in clinical trials, along with laboratory scientists in other areas of cancer research. “We established this program as an integrated effort with a highly collaborative group of scientists to focus on problems related to hematologic malignancies and to accelerate improved treatment for cancers of the blood,” says Majeti.

Other research at the Majeti lab is also aimed at understanding AML and moving towards new treatments. One project is looking at recurrent genetic mutations in AML to determine their role in the development of the disease and to create new therapies based on that understanding. Another project isolates residual blood-forming stem cells in bone marrow samples from patients at the time of AML diagnosis to demonstrate that not all leukemic mutations are contained in the residual cells. That investigation could support the hypothesis that mutations must be serially acquired in clones of blood-forming stem cells, and it could help identify the source of relapse that causes significant mortality in AML.

One of the unique things about academic medicine is that physician-scientists can be actively involved in clinical care as well as research and drug development. “It is incredibly motivating to interface with patients and their families and to move towards a therapy that we hope will eventually improve their lives,” says Majeti.

“My long-term goal is to make an impact on outcomes of patients with AML. That’s my personal mission and vision,” Majeti says. In 2015, he was awarded the prestigious Leukemia & Lymphoma Society Scholar Award, with five years of support for his original investigations that could be translated into improved treatments and cures for patients with hematological cancers. Majeti concludes: “We are treating patients with the same crappy chemotherapy drugs that were being used in the 1980s, and patients don’t do well. We need to bring new approaches to our patients.”

Reducing the Pain of Writing Metadata

Mark Musen, MD, PhD, and Scott Delp, PhD

Over the past four or five decades, scientific journals have tended to limit article length, especially in the methods section. Methods sections have been shrunk in several ways, including smaller font sizes and tighter word limits, as well as banishment to online-only supplements. As a result, it has become increasingly difficult for one group of investigators to replicate the findings of another; one recent effort in psychology, reported in the August 2015 issue of Science, succeeded in replicating only 36 of 100 studies. Even when investigators make their experimental data publicly available—as federal funding agencies require—other investigators may be stymied in their attempts to make sense of the data or to re-analyze it in any meaningful way.

And yet, “the scientific method requires nothing less than that experiments be reproducible and that the data be available for other scientists to examine and reinterpret.” That sentence, borrowed with the permission of Mark Musen, MD, PhD (professor, Biomedical Informatics), from his article in the Journal of the American Medical Informatics Association in June 2015, is the foundation of his current effort to help authors create their metadata to annotate their scientific results.

Metadata is sometimes defined as data that describes other data. It is essentially the “methods section” that researchers must write so that others can use their data and thus reproduce their findings or build on them.

Once they have completed their experiments and amassed their data, however, the last thing workers in biomedicine want to do is go back to the beginning and explain step-by-step the process they followed and the details of the investigation. Musen and his colleagues aim to help ease the chore of writing metadata because, as he says, “people hate to author metadata.”
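
As a toy illustration of “data that describes other data,” a metadata record accompanying an experimental dataset might look like the dictionary below. The field names and values are invented for illustration and are not CEDAR’s actual template schema:

```python
# A minimal, hypothetical metadata record describing a primary dataset.
# In practice such records follow community templates so that both humans
# and software can interpret the accompanying data.
import json

metadata = {
    "dataset": "plasma_solutes_2015.csv",
    "organism": "Homo sapiens",
    "sample_type": "plasma",
    "assay": "untargeted mass spectrometry",
    "units": "umol/L",
    "collected": "pre- and post-dialysis",
}

# Serialized alongside the primary data so other groups can interpret it.
print(json.dumps(metadata, indent=2))
```

The point is not the particular fields but that they are structured and machine-readable, rather than buried in a prose methods section.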

The Center for Expanded Data Annotation and Retrieval (CEDAR) was created to develop computer-assisted approaches to overcome the impediments to creating high-quality biomedical metadata. CEDAR is supported by the Big Data to Knowledge Initiative of the National Institutes of Health, with the goal of developing new technology to ease the authoring and management of biomedical experimental metadata.

Musen, who is CEDAR’s principal investigator, explains: “There is a compelling need to solve the metadata problem so that we can move on to the next way in which we can use computer-stored knowledge to drive biomedical investigation. Ultimately, the goal is to replace the dissemination of scientific results in the form of prose journal articles with computer-interpretable information—allowing Google-like agents to ‘read’ the literature and to summarize what they find.”

CEDAR has embarked on this project by having groups of biomedical scientists create metadata templates and store them in a repository where other scientists can use all or parts of them to author their own metadata. Choosing template components that best match their needs, scientists then annotate their own data by filling in metadata acquisition forms from the repository. Once completed, the metadata will accompany the primary data to archives where other scientists will have access to them. These metadata will also remain in CEDAR’s repository, which will be analyzed repeatedly to find patterns that enable the metadata acquisition tools to use predictive data entry (essentially pre-populating the acquisition forms with likely text), simplifying and speeding metadata authoring.
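
The predictive-entry idea, pre-populating a form with likely values mined from earlier submissions, can be sketched very simply: count how often each value was previously entered for each field and suggest the most frequent one. The records and field names below are hypothetical, not CEDAR’s actual repository format:

```python
# Sketch of predictive data entry for metadata forms: suggest each field's
# most frequently used prior value as a default. Records are invented.
from collections import Counter

previous_records = [
    {"organism": "Homo sapiens", "tissue": "blood", "assay": "RNA-seq"},
    {"organism": "Homo sapiens", "tissue": "blood", "assay": "mass spec"},
    {"organism": "Mus musculus", "tissue": "blood", "assay": "RNA-seq"},
]

def suggest_defaults(records):
    """Most common previously entered value for each metadata field."""
    counts = {}
    for rec in records:
        for field, value in rec.items():
            counts.setdefault(field, Counter())[value] += 1
    return {field: c.most_common(1)[0][0] for field, c in counts.items()}

print(suggest_defaults(previous_records))
```

A real system would refine its suggestions as the record is filled in (for example, conditioning the assay suggestion on the organism already entered), but even this frequency-based default removes much of the drudgery.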

“Natural language can be very ambiguous,” says Musen, “and one of the challenges we have sometimes is to use the computer to clarify what an actual procedure is. Thirty years ago we had two oncologists at Stanford looking at a protocol that they had just written together. What was startling was that they couldn’t agree whether they were to first give the chemotherapy and then the radiotherapy, or first give the radiotherapy and then the chemotherapy. It wasn’t until they could see it in very clear terms in the computer system that they realized the text of the protocol that they had written together was ambiguous.”

Musen concludes: “This doesn’t happen very often but it points out the fact that there is a great advantage to the clarity that the computer system offers. It often obviates some of the problems that people run into when dealing with natural language.”  

Tools such as those developed by CEDAR will not only help make metadata more precise but also more complete and comprehensive. If online datasets can be made more understandable to both humans and computers, in the end we should expect nothing less than better science.