Question: What led employers to start drug testing in the first place?
Drug testing got its start during the Vietnam War era. In 1971, President Richard Nixon directed the military to initiate a urine drug testing program, which yielded a disturbingly high positivity rate among military personnel returning from Vietnam. In 1982, the Department of Defense formally defined forensic drug testing requirements, and the Army, Navy, and Air Force established panels of active-duty scientists to develop and implement forensically sound drug testing procedures.
In the mid-1980s, Quest Diagnostics began performing employer drug tests and shortly thereafter released the first Drug Testing Index (DTI), which examined positivity rates to provide a comprehensive analysis of drug-use trends. The first DTI, published in January 1988, reported a drug test positivity rate of 13.6%. That same year, Congress passed both the Drug-Free Workplace Act and the Anti-Drug Abuse Act in response to a mounting number of drug- and alcohol-related accidents, including one fatal train accident in which the train’s operator tested positive for marijuana. The new federally mandated drug testing guidelines, a growing network of testing providers, and new data from sources like the DTI started a trend toward testing among private-sector employers.
As awareness of drug testing grew, regulators at both the federal and state levels moved to establish guidelines for testing. In 1991, the Omnibus Transportation Employee Testing Act was passed, requiring drug and alcohol testing of safety-sensitive transportation applicants and employees regulated by the U.S. Department of Transportation. State agencies regulating workers’ compensation and unemployment benefits stepped in to create rules for handling claims when employees failed drug tests, and some offered discounts on annual insurance premiums to employers with drug-free workplace programs in place. As a result, workplace drug testing programs increased significantly.
Over the past 25 years, drug test positivity rates have steadily declined among employers with drug testing programs in place, from a high of 13.6% in 1988 to 3.5% in 2012. This downward trend suggests that drug testing effectively discourages drug use among applicants and employees; simply having a program in place appears to lower positivity rates, as drug users seek out employers who do not test. According to the Substance Abuse and Mental Health Services Administration (SAMHSA), nearly 7% of adults employed full-time and 9% of those employed part-time currently use illicit drugs. The U.S. Department of Labor reports that more than 60% of adults know someone who has come to work under the influence of alcohol or other drugs. These figures point to drug use at a higher rate than is detected among our drug testing clients, further supporting the conclusion that drug testing programs deter drug use.
For more information about drug testing, visit our website.