Society needs to address ethical and moral issues of genetic testing soon

The University Record, October 29, 1997

By Deborah Gilbert
News and Information Services

Genetics is a field fraught with great promise and risk, according to Elizabeth M. Petty. A genetics researcher and assistant professor of internal medicine and human genetics, Petty gave a historical overview of the field of genetics at the School of Public Health’s Alumni Day Oct. 17, drawing on statements published over the last 100 years that demonstrate both the dangers and opportunities inherent in the field.

Petty’s topic was apt, noted Dean Noreen M. Clark, “because the School is the nation’s first to offer an interdepartmental concentration in public health genetics.”

Petty explained that in the late 19th century, eugenicists, not conscious of the devastating social implications of their stance, boldly defined their field as understanding inheritance for “race betterment.” That view of eugenics remained in place to a greater or lesser degree until after World War II, by which time the Nazis and the Holocaust had taught the world the consequences of “racial betterment.”

Today, geneticists and genetic counselors in the United States, well aware of the potential for misusing genetic information in the formation of government policy, focus on providing individuals with genetic counseling and education so that they can plan their lives appropriately and make informed reproductive decisions of their own.

Not all nations operate that way, however. In China, couples with genetic diseases are not allowed to marry unless they sign a contract saying they will be sterilized or engage in long-term birth control.

Some key questions recur in the United States, Petty said, such as, “When is it appropriate to test? For instance, is it appropriate to do prenatal testing for Huntington’s disease, which is not currently treatable but does not generally become symptomatic until people are between 30 and 50 years of age?”

Also, when, if ever, should testing be mandatory? In the 1960s, testing for PKU (a rare inherited metabolic disorder) in newborns became mandatory in some states. “In this instance, it seems appropriate, because PKU can be treated if it is caught early, and if not, leads to significant mental retardation,” she said.

On the other hand, for several years in the 1970s, some states mandated testing of African Americans to determine if they carried the gene for sickle cell anemia. “However, there was not always appropriate counseling provided to those who tested positive, the test results were accessible to employers, and a number of carriers who were healthy themselves lost their jobs. This was horrendous public policy,” Petty said.

Petty noted that genetic testing today doesn’t all take place in the laboratory. “It is a process,” she said, that includes analyzing family history, a physical assessment of the individual and interpretation of DNA tests, followed closely by education, counseling and, as necessary, support and management for the patient.

“Genetic susceptibility testing doesn’t tell when a disease will strike or how severe the symptoms may be,” she added, “nor does it always predict the exact degree of risk.” It also does not account for the role of environment or of individual health behavior, which complicate predictions for an individual’s future health.

With the advent of increasingly sophisticated technology, DNA testing for genetic diseases is becoming less costly and more easily accomplished, Petty said, so society at large must examine the thorny ethical and moral issues involved, and help address appropriate public policy soon.
