
05/05/2016

The European Biosimilars Market

In previous blogs, I have written about what biosimilars are and have speculated about the development of the US biosimilars industry. While the US biosimilars market may not fully mature for another decade, the European biosimilars market has been expanding for years. In this blog, I will analyze what the European biosimilars market looks like and whether it is a preview of the US biosimilars market.

The biosimilars market in Europe has existed for over a decade. The first approved biosimilar was a somatropin product in 2006. Since then, Europe has seen 22 biosimilars emerge in several different drug classes (more information available here). Given all the potential savings that biosimilars offer, 22 products may seem like a small number to be developed over a decade. As in the US, one barrier for biosimilars in Europe is the development pathway. The European Medicines Agency (EMA), which operates similarly to the FDA, provides the legal pathway for biosimilars. Unlike the FDA, however, the EMA has issued many product-specific guidelines, which may be hindering the adoption of biosimilars. Additionally, the EMA does not provide pricing or reimbursement guidelines. That responsibility instead falls to each individual member state. Some member states, like Germany and the UK, have made pricing policies favorable to biosimilars: Germany has prescribing quotas and the UK has reimbursements for physicians. Others, like Italy and France, have less favorable policies in the form of universal price regulation, which offers no incentive for the adoption of biosimilars. This mesh of product-specific legal guidelines and member-state-specific pricing policies has slowed the biosimilars market in Europe.

Despite these challenges, biosimilars still offer great savings in Europe. Biosimilars can be priced as much as 30% below their European reference products. Even with slow adoption, some reports show that biosimilars could save Europe as much as $33 billion by the year 2020. At first glance, it seems as though these savings would create an environment where all European member states would mimic the UK and Germany and embrace biosimilars. One might expect that over the next decade, biosimilar policy and use across Europe will look almost identical. History, however, reveals that this may not be the case. Even though it is no secret that biosimilars can bring savings to health care systems, countries that have not already adopted them are not guaranteed to do so in the future. What is likely is that biosimilars will continue to bring savings, but only to the handful of member states adopting favorable policies.

The European biosimilar experience is unlikely to be repeated in the US. The FDA may end up with a less cumbersome development pathway than Europe's, which could lead to more rapid adoption. Individual states will also not have the same policy and reimbursement variability as European member states. In other words, policies in Pennsylvania will look similar to those in New York, while policies in the UK will continue to vary greatly from policies in Italy. That being said, the US biosimilars market will still be unpredictable. All Europe teaches us is that, despite unique roadblocks, biosimilars can still bring large savings to health care systems that accommodate them.

Robert Bond, PharmD '18

04/29/2016

What is Big Data and how will it revolutionize the health industry? Part II

Big Data is poised to revolutionize the healthcare industry. The revolution goes beyond just analyzing text-based notes. It is being used in predictive analytics, prescriptive analytics, genomics, and in many other ways.

You may have heard the term “Internet of Things.” This refers to the fact that many devices are now connected to the Internet, from your phone to your car to wearables like the Apple Watch and FitBit. It is estimated that by 2020, there will be 25 billion connected devices. These devices capture real-time data and allow for real-time alerts. They produce enormous amounts of data on the individual. In combination, they can provide even more information on entire populations.

Big Data can fill in the blanks for predictive analytics, “the use of data, statistical algorithms and machine-learning techniques to identify the likelihood of future outcomes based on historical data.” Electronic Medical Records can be reviewed and analyzed. An individual patient generates a great deal of data, which can be analyzed to predict whether or not they will comply with their doctor’s recommendations. For example, one hospital found that patients who live in certain neighborhoods are more likely to miss appointments. It concluded that it was actually cheaper to send them a taxi to bring them to the appointment than it was to deal with a missed appointment. This was determined by combining multiple data sources: patient data, neighborhood data, and administrative data.
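To make this concrete, here is a minimal sketch of how such a prediction might be built with scikit-learn; the columns, data, and cost trade-off are hypothetical, not taken from the hospital described above.

```python
# Hedged sketch: predicting missed appointments from historical data.
# All column names and values are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression

# Hypothetical joined dataset: one row per past scheduled appointment,
# combining patient, neighborhood, and administrative data.
appointments = pd.DataFrame({
    "distance_miles":     [1.2, 6.5, 0.8, 9.3, 4.1, 7.7, 2.0, 8.8],
    "prior_no_shows":     [0,   2,   0,   3,   1,   2,   0,   4],
    "has_car":            [1,   0,   1,   0,   1,   0,   1,   0],
    "missed_appointment": [0,   1,   0,   1,   0,   1,   0,   1],
})

X = appointments[["distance_miles", "prior_no_shows", "has_car"]]
y = appointments["missed_appointment"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Fit a simple classifier on historical appointments...
model = LogisticRegression().fit(X_train, y_train)

# ...then score upcoming appointments; a high predicted risk could trigger
# an intervention such as sending a taxi.
no_show_risk = model.predict_proba(X_test)[:, 1]
print(no_show_risk)
```

A real model would use far more variables and far more history, but the workflow is the same: fit on past appointments, score upcoming ones, and intervene where the predicted risk justifies the cost of the intervention.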

Remember, these data are not all being collected by the researcher. They are being collected independently, and the researcher is able to query the different sources to make a prediction.

Prescriptive analytics are a goal of Big Data in healthcare–to be able to identify and predict the path of a patient, then intervene to set them on the right path. For example, if a patient is supposed to walk a certain number of minutes a day, their phone or wearable would be able to see, in real time, if they choose to do so. If the patient allows these data to be shared with their physician, the physician can connect with the patient and determine why they are not complying. This would allow for immediate interventions that were not possible before.

When a person uses their cell phone late at night, it may indicate they are having trouble sleeping, which their physician can then address. These are very simple examples, but they demonstrate how real-time data can be captured and used to nudge patients in the proper direction.

Genomics research is a third area of opportunity in Big Data. The cost of mapping out an individual’s genome has plummeted since the completion of the human genome project. The individual’s genome itself is a massive dataset. When you can compare the genomes of millions of people, you can gain insight into the effectiveness of medicines. We are already seeing a move towards personalized medicine, which will only be strengthened by the Big Data revolution.

Traditionally, an oncologist might find that patients of European descent respond differently from non-Europeans to a particular treatment, which can then be used to determine the first-line or second-line treatment for those subpopulations. Now, with genomic testing, oncologists can see that those with a particular genetic marker respond very well, or not at all, to a particular treatment. With rapid genomic testing, the oncologist can then use a patient’s genomic information to recommend the treatment most likely to be effective. We are now able to identify patient sub-populations based on genetic markers, which allows for targeted therapy. Think of the advances this will bring us in treating cancer or other devastating diseases.
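As a rough, purely illustrative sketch of what comparing subpopulations by a genetic marker can look like, the snippet below groups hypothetical patients by marker status and compares response rates; the data are invented for the example.

```python
# Hedged sketch with invented data: response rate by genetic marker status.
import pandas as pd

patients = pd.DataFrame({
    "marker_positive": [True, True, True, False, False, False, True, False],
    "responded":       [1,    1,    0,    0,     0,     1,     1,    0],
})

# A large gap in response rates between the two groups would suggest the
# marker could help decide which treatment to try first.
print(patients.groupby("marker_positive")["responded"].mean())
```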

As more genomic data are captured and compared, we will be able to make insights that were nearly impossible to make before. We can begin to see what was once invisible. The more data there are, the more insights we can glean.

When enough Big Data are available, the insights we will be able to make are beyond comprehension. Big Data is already transforming how we think of health and public health, and it will continue to revolutionize healthcare for years to come.

 

Magdi Stino, Health Policy PhD Candidate

04/15/2016

Introduction to Biosimilars

As clinical guidelines are published and the pharmaceutical industry innovates, the practice of pharmacy changes. At the forefront of innovation is the biologic. These medications are capable of achieving clinical outcomes that traditional small-molecule medications cannot. Biologics are not without disadvantages, however. In addition to being more challenging to create, these medications are also orders of magnitude more expensive. This cost has left biologics as a distant alternative, or not even an option, for patients who would greatly benefit from their use. To address this issue, the pharmaceutical industry is developing biosimilars: medications that are similar to existing biologics and are offered at a lower cost.

Most medications found in a pharmacy are small-molecule products. These drugs are synthesized chemically and have been relatively cheap to produce. When a small-molecule drug is first offered on the market by a single proprietary manufacturer, it is known as a brand drug. The costs of these products are typically high because, for a regulated period of time, that proprietor is the only entity legally allowed to produce the drug. This creates a temporary monopoly, allowing the proprietor to sell at a price with no competition. After that period is up, other manufacturers are allowed to create what are known as generics. This system of proprietary creation followed by generic competition is regulated by the Hatch-Waxman Act. The legislation has two goals. The first is to create an incentive for new drug development by allowing innovator manufacturers enough of a monopoly to make a profit despite high research and development costs. The second is to lay down a framework in which generics can come in and make prices reasonable for patients. When both goals are met, a balance is struck between continuing innovation and low drug costs. Hatch-Waxman has created a brand-generic model that works well and, for the time being, describes the inventory of most pharmacies.

For biologics, this model cannot be applied. The reason lies in how biologics are synthesized. Biologics are made by genetically engineered cells, which produce proteins that are then isolated. As you can imagine, this process is much more complex and difficult. The process for engineering these cells may be a trade secret, which means non-proprietary manufacturers must come up with a different way to arrive at a similar protein. The fact that the active pharmaceutical ingredient is a protein makes replication even more challenging. A protein’s function is derived in part from its tertiary structure, the way in which the protein is folded. Slight alterations in the amino acid chains that make up the protein could alter the tertiary structure and therefore alter its function. In other words, the creation of true generics for proprietary biologic medications is nearly impossible.

This is where biosimilars come into the picture. Biosimilars are medications that are highly similar to a reference product. A reference product is the analogue of the brand product in the small-molecule model. The sponsors of a biosimilar must go through a new legal route before they can market the product. Thus far, that legal pathway has been the Biologics Price Competition and Innovation (BPCI) Act. This act is designed to work similarly to the Hatch-Waxman Act. The BPCI Act is not the complete story, however, leaving the path biosimilars must follow unclear. The amount of research needed and the time it will take to develop a biosimilar, for example, are still unknown.

In conclusion, biosimilars are medications that are clinically similar to proprietary biologics and can be produced and sold at a lower cost. Due to the complex nature of biologics themselves, biosimilars will not be the new generics but could lower health care costs and bring innovative pharmacotherapy to more patients.

Robert Bond, PharmD '18

04/12/2016

What is Big Data and How Will it Revolutionize the Health Industry? - PART I

Big Data is one of those new terms that has been getting a lot of media coverage. If you’re like me, you have been confused by what it even means. The short answer is that Big Data is a new approach for organizing and analyzing the massive amounts of data being generated each day. Big Data allows for insights that were practically impossible under traditional approaches. We are at the doorstep of a revolution, yet we still haven’t maximized our potential with old techniques and approaches.

Before we dive into the future of Big Data, it helps to first realize how much data modern society is producing each day. Eric Schmidt of Google noted that “from the dawn of civilization until 2003, humankind generated five exabytes of data. Now we produce five exabytes every two days…and the pace is accelerating.” (How much is an exabyte?) More recently, it was estimated that, as of 2012, we produce two and a half quintillion bytes of data every day. These include everything from your credit card purchases, to the photos you take on your phone, to your social media posts. Everything is being digitized and ubiquitously captured, and more data are being produced constantly. Every phone call you make is recorded. Every song you play in iTunes is documented somewhere.

We have an astounding volume, variety, and velocity of data–“the three Vs.” This is where Big Data comes in. Big Data is a new approach to storing, reading, and analyzing these data, which are distributed over many different platforms, and are not standardized. This differs from the traditional mode of data analysis.  The traditional approach has been to organize and build what are called “relational databases,” then apply statistical analysis methods to answer specific questions–often the databases are built for the purpose of answering those specific questions.

A relational database is simply a set of data tables, each made up of rows and columns, which are joined together by one or more columns used as an identifier. For example, if you have a student ID card, then the university has a table of all student IDs, with personal information about each student. Then, there would be another table, say one with course registrations by student. Every time you register for a course, a new row is created with your student ID and the course number. Because each table makes use of your student ID as an identifier, an analyst can find your information from each table–to create a class roster, say, or to print out your schedule for this semester. We can find all the student ID numbers registered for a particular course, then find information on each student from the other table.
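As a small illustration of the description above, the sketch below builds those two tables in SQLite and joins them on the student ID to produce a class roster; the table names, students, and courses are made up.

```python
# Hedged sketch of a relational database: two tables joined on an identifier.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One table of students keyed by student ID...
cur.execute("CREATE TABLE students (student_id INTEGER PRIMARY KEY, name TEXT)")
# ...and another of course registrations that reuses that ID.
cur.execute("CREATE TABLE registrations (student_id INTEGER, course TEXT)")

cur.executemany("INSERT INTO students VALUES (?, ?)",
                [(1, "Ada"), (2, "Grace")])
cur.executemany("INSERT INTO registrations VALUES (?, ?)",
                [(1, "PHARM101"), (2, "PHARM101"), (1, "STAT200")])

# Joining on the shared identifier produces a roster for one course.
cur.execute("""
    SELECT s.name, r.course
    FROM registrations r
    JOIN students s ON s.student_id = r.student_id
    WHERE r.course = 'PHARM101'
""")
print(cur.fetchall())  # e.g. [('Ada', 'PHARM101'), ('Grace', 'PHARM101')]
conn.close()
```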

A shopper rewards program works the same way. One table records your reward number and all your personal information. Companies can use what are called data mining techniques on this database to encourage more sales. For example, retailers already send catalogs and specials to their customers. If they know your shopping history, they can customize the mailers they send you to highlight items you are more likely to buy. Even just knowing the gender of the customer allows them to segment their advertisements, and get a better return on investment. The more they know about your preferences, and the preferences of people like you, the better they can customize their engagement with you.

But, even with sophisticated techniques like data mining, and with massive transaction databases, we are still not in the world of Big Data. The examples I just gave are part of the traditional approach. The tables are organized in advance, data are captured and recorded neatly in the tables, and normal methods of analysis are used. This is not Big Data–this is just lots of data.

Big Data, unlike this traditional approach, does not need to use relational databases in its analyses. The data are not “collected” in the same way. Oftentimes, the data are being collected (or archived) without the intent of analyzing them later. Big Data does not require any predefined structure. Data do not have to be neatly organized in tables with rows and columns like relational databases.

Nearly everything we do in modern society leaves a digital footprint. Big Data allows us to use and analyze these data by applying specific techniques. Primarily, Big Data makes use of Hadoop for faster file storage and data retrieval. Hadoop, an open source architecture developed at Yahoo based on research conducted by Google, is the primary Big Data tool. Hadoop uses a distributed file system in which raw data are saved across multiple nodes, using a single hierarchy of directories, usually in 64 MB chunks. The data are not cleaned or organized in any way, and no business rules are applied. The data are not transformed. Big Data, using Hadoop, allows users to query those data and gain meaningful insights. Facebook, as an example, uses Hadoop to store the massive data generated by its users every single second.

Practitioners of Big Data believe in the “sushi principle”; that is, data should be raw, fresh, and ready to consume. Don’t cook the data! Keep it in its raw form. 

Because Hadoop is open source and runs on commodity hardware rather than specialized hardware, it is much cheaper and simpler to store data than with traditional methods. The difficulty arises later, in querying and analyzing the data.

Whereas before, specialists were required to build the data sets, create the schema, and capture the data in a consistent way, Big Data eliminates these required skills at the front end, since Hadoop standardizes the approach to storage.

Big Data requires expertise and creativity in the querying end. Querying can be complicated, since the data are being retrieved from multiple sources, which are not organized in a standardized way. SQL is becoming the standard querying language in Big Data, as it has been in traditional relational databases.
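As one hedged example of what that querying can look like, the sketch below uses PySpark, a common engine in the Hadoop ecosystem (my choice of tool, not one named above), to run ordinary SQL over raw, untransformed JSON files stored in HDFS; the path and field names are hypothetical, and a running Spark/Hadoop environment is assumed.

```python
# Hedged sketch: SQL over raw files with PySpark (path and fields hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("raw-event-query").getOrCreate()

# Read raw, untransformed JSON event logs straight from distributed storage.
events = spark.read.json("hdfs:///data/raw/device_events/")
events.createOrReplaceTempView("events")

# Query them with ordinary SQL, even though no schema was designed up front.
daily_steps = spark.sql("""
    SELECT patient_id, to_date(event_time) AS day, SUM(steps) AS total_steps
    FROM events
    GROUP BY patient_id, to_date(event_time)
""")
daily_steps.show()
```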

Because the field is so new, it has been said that the only people with ten years of experience in Hadoop are the people who developed it in the first place.

This provides a huge opportunity for data scientists in the future, and Big Data will surely create a huge demand for analysts who can work within the architecture.

Magdi Stino

Health Policy PhD Candidate

04/07/2016

Policies to Help Stabilize Rising Drug Costs

In the last blog I reviewed drug pricing terminology between the wholesaler and pharmacies. In this blog I will review how this process can lead to increasing drug costs. I will also discuss two public policies that have been implemented to try to stabilize that trend.

As mentioned previously, pharmacies are reimbursed at a discounted Average Wholesale Price (AWP). Pharmacies can seek deals with wholesalers to buy drugs at the Wholesale Acquisition Cost (WAC) or the Average Manufacturer Price (AMP). Pharmacies then make a profit by selling at around AWP. In other words, a pharmacy’s profit can be represented as AWP minus either WAC or AMP. Pharmacies can increase their profits by selling drugs with higher AWPs. Knowing this, manufacturers may attempt to set a higher AWP. Since pharmacies are drawn to higher AWPs and purchase from wholesalers, wholesalers will carry the highest-AWP drugs they can to satisfy the pharmacies they sell to. The relationship between the manufacturer and the wholesaler is similar: the manufacturer will set the highest AWP possible to attract wholesalers, who are going to buy at a lower price like AMP anyway. Because wholesalers do not purchase drugs at AWP, but at a lower price, an increase in the AWP will not cause them to search for other manufacturers. This means manufacturers can increase AWP while simultaneously satisfying the needs of pharmacies, making their product more attractive to wholesalers without losing business to competing manufacturers. AWP therefore has a natural tendency to increase, because pharmacies want it higher and manufacturers can raise it without the risk of losing business from wholesalers.
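A toy calculation makes the spread concrete; the prices below are hypothetical and serve only to show why raising AWP widens the pharmacy's margin without costing the wholesaler or manufacturer any business.

```python
# Hedged sketch of the AWP spread (all prices hypothetical).
awp = 100.00  # Average Wholesale Price: roughly what the pharmacy is reimbursed
wac = 80.00   # Wholesale Acquisition Cost: roughly what the pharmacy pays

print(f"Margin per unit: ${awp - wac:.2f}")

# If the manufacturer raises AWP while acquisition costs stay put,
# the pharmacy's margin grows and the payer absorbs the increase.
awp_raised = 120.00
print(f"Margin after AWP increase: ${awp_raised - wac:.2f}")
```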

Increasing AWP does increase costs to insurers and taxpayers. Since the government serves as an insurer, it has put policies in place to prevent increases in AWP from bankrupting it. The federal government does this by imposing a Federal Upper Limit (FUL). This limit is the maximum price at which Medicaid will reimburse a pharmacy for a drug. In order for a drug to qualify for an FUL price, it must have at least three equivalent products made by three different manufacturers. To put it simply, for “Drug X” to have an FUL price, it must have three therapeutically equivalent generics made by at least three competing manufacturers. The FUL price is set at 150% of the cost of the cheapest equivalent drug. If “Drug X” is made by manufacturers 1, 2 and 3, and the cheapest price is from manufacturer 1 at $100, then “Drug X”’s FUL price is $150. These strict qualifications mean that some drugs do not have an FUL price, and for those that do, paying 150% of the cheapest generic’s price may not produce any savings. To make up for these limitations, some states have created Maximum Allowable Costs, or MACs (more information available here). The MAC was designed to operate as a continuation of the FUL, further increasing savings at the state level. MAC prices are set by each state individually, without strict rules for which drugs qualify or what the price ceiling should be. This has created variation between states, with some states qualifying more drugs and setting more aggressive price ceilings than others. Whether these MACs were worth the resources put into their creation remains to be seen.
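The “Drug X” arithmetic can be written out directly; the sketch below assumes the three hypothetical manufacturer prices from the example.

```python
# Hedged sketch of the FUL rule: 150% of the cheapest therapeutically
# equivalent product, provided at least three such products exist.
generic_prices = {"manufacturer_1": 100.00,
                  "manufacturer_2": 130.00,
                  "manufacturer_3": 145.00}

if len(generic_prices) >= 3:
    ful = 1.5 * min(generic_prices.values())
    print(f"FUL reimbursement ceiling: ${ful:.2f}")  # $150.00
else:
    print("Fewer than three equivalent products: no FUL price applies.")
```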

In conclusion, both state governments and the federal government have created policies to curb the natural tendency for AWP to rise. The federal government first created the Federal Upper Limit, or FUL, and states later created Maximum Allowable Costs, or MACs, based upon the FUL. The FUL has severe limitations in the form of drug qualifications that are too strict and a 150% price ceiling that can be ineffective. The MAC, on the other hand, may be a step in the right direction, but since it is based on the FUL and shares similar limitations, its effectiveness remains to be seen. Moreover, while the FUL and MAC may be effective in some situations, they alone are not enough to prevent increasing drug costs.

Robert Bond, PharmD '18

 

USciences Research Gains Traction in Men's Health

USciences’ motto is “proven everywhere.” One reason the “proven everywhere” motto makes sense for USciences is that both the students we teach and the professors themselves use scientific research as the basis for learning and scholarship. One such area is the Health Policy Program at Mayes College of Healthcare Business & Policy. Health policy is the investigation of problems in health in its broadest sense (not just healthcare and its delivery), using scientific methods of study to develop evidence-based recommendations for changes and innovations in policy. One challenge is that policymakers sometimes eschew data and evidence when making policy; rather, they are sometimes drawn to the opposite – anecdotes – heart-wrenching stories from constituents.

When data and evidence alone fail to inform policy, another option is to make the best possible case for particular policies using the force of ethical argumentation. In this regard, evidence and data are bolstered by analysis of the very values that undergird health, which provides a rationale for particular policy approaches. This is the case with some recent work undertaken by Health Policy Ph.D. candidate Janna Manjelievskaia, MPH, and Visiting Assistant Professor David Perlman, Ph.D.

Janna was working with colleagues on a paper examining the policy issues associated with the current U.S. Preventive Services Task Force (USPSTF) recommendation against testicular cancer screening. She suggested to her colleagues that the paper could be enhanced with an ethical angle and asked Dr. Perlman, one of her professors who focuses on ethics in health policy and public health, to join in writing the paper, which was recently published in the American Journal of Men’s Health and presented at their conference. The lead author of the paper, Michael Rovito, Ph.D., an Assistant Professor at the University of Central Florida, was recently interviewed by STAT about the importance of testicular self-examination. The paper, with the power of its ethical argument coupled with a careful, scientific examination of policy, is gaining traction with policymakers, which should hopefully lead the USPSTF to revise its current recommendation against testicular cancer screening. When that happens, it will be yet another instance of how USciences research and students are “proven everywhere.”

David Perlman, PhD

Janna Manjelievskaia, MPH

02/15/2016

Turing Pharma and the drug price debate

In September 2015, Turing Pharmaceuticals raised the price of Daraprim® (pyrimethamine) from $13.50 per tablet to $750. The more than 5,000% price increase put a national spotlight on the practice of raising drug prices, including for generics such as Daraprim, which is used to treat toxoplasmosis infections. The company’s CEO, Martin Shkreli, has been vilified in the press and in social media, which has kept the subject in the public consciousness. The price hike by Turing, which had only acquired the drug that August, was seen by many as price gouging. Shkreli’s action has heightened scrutiny of drug pricing policies and raised public awareness of the arbitrariness of pricing in the US market.

Shkreli defended the move, contending that the higher profitability means more funding will be available for toxoplasmosis research.  If the market size is limited, then profitability would come through higher margins.  The higher margins, Shkreli’s camp argues, compel manufacturers to produce enough of the drug to meet demand.  No one who needed the drug has been unable to get it due to price, the company has claimed, and Turing provides support for those who are unable to afford it.

Turing also argues that it has been singled out for media attention for the practice of acquiring a drug and raising its price. Older drugs, such as cycloserine, a tuberculosis drug, and doxycycline, an antibiotic, have had their prices raised after acquisition. Valeant purchased Isuprel® and Nitropress® and subsequently raised the price of both. CBS News reported that a Bloomberg News study found that 20 prescription medications have had their prices quadrupled since 2014, and 60 drugs have had their prices at least doubled. The same study found that Novum Pharma raised the prices of two anti-inflammatory steroids, Alcortin A® and Novacort®, by 2,000 and 3,000 percent respectively during that period. In this light, Turing’s move was not completely out of the norm.

Critics counter that such dramatic price increases are dangerous. The Infectious Diseases Society of America (IDSA) and the HIV Medicine Association (HIVMA) sent a letter to Turing warning that the practice was “unsustainable for the health care system” and posed a risk to public health. Social media users have lambasted Shkreli for exploiting people with serious illnesses for the sake of profitability. Shkreli, his critics contend, is behaving more like a hedge fund manager than a steward of the public health.

Since this story broke, more attention has been given to the practice of hiking drug prices, such as at Gilead. The company is facing scrutiny over the cost of two hepatitis C medications, Sovaldi® and Harvoni®, which cost $84,000 and $94,500 respectively per regimen. The attorney general of Massachusetts wrote a letter to Gilead saying she was looking at whether the drugs are overpriced and whether to invoke consumer protection laws in that effort, which would be a first. Pfizer has faced criticism for hiking the prices of 100 drugs at the beginning of January 2016, and its planned merger with Allergan has emerged as an issue for Democratic presidential candidates Hillary Clinton and Bernie Sanders. Truveris, a research firm, found that drug prices had gone up by 10.4% since 2014, and that brand-name drug prices have gone up by 15%.

While Shkreli and Turing’s move was jarring, the public now knows that such price increases are not extraordinary. New specialty drugs entering the approved drug market are further driving up prices. By contributing to higher healthcare costs, price increases for existing drugs reduce the apparent effectiveness of health care spending. Prescription drugs make up a substantial portion of US national healthcare costs, which are the highest in the world. The price increases lead to higher insurance payments and copays.

An investigation has been opened by the House Committee on Oversight and Government Reform. Ranking member Rep. Elijah Cummings released memos from a preliminary investigation which appeared to show that the price moves by Turing were purely for increasing profits, and that R&D costs were minimal. Shkreli testified before the committee on February 4, 2016, where he repeatedly invoked his Fifth Amendment right against self-incrimination.

Shkreli was arrested in December 2015 on allegations of securities fraud predating his takeover of Turing, and he has continued to draw public notoriety for his actions. He has, through his public defiance, caused the discussion to persist. It is notable that the national vilification of Shkreli has contributed to the debate, demonstrating that personalities can influence policy as much as raw numbers.

Affordable medications are a crucial part of public health.  Likewise, the profit motive provides an incentive for pharmaceutical companies to produce sufficient quantities of a particular drug.  In the current system, a balance is needed.  Turing’s price hike of Daraprim has heightened the scrutiny of a perceived imbalance towards profitability and away from affordability.  With drug prices entering the presidential elections, it is likely that pricing will continue to be a major political issue.

Magdi Stino, PhD Candidate, Health Policy

11/21/2014

Medication Adherence In Patients with Depression

Depression is a mental disorder with no single known cause. There are many proposed explanations for developing depression, including genetic, biological, environmental, and psychological factors. Signs and symptoms of depression vary from minimal to severe. Indications that someone may need medication to regulate his or her mood include the following symptoms: persistent sadness, hopelessness, fatigue, irritable mood, loss of interest, feelings of guilt, difficulty concentrating, insomnia, and suicidal thoughts.

There are a variety of classes of medications used for depression, but they all need to be given an adequate trial of about twelve weeks to see if the medication is efficacious. Roughly fifty percent of patients prematurely discontinue antidepressant therapy. There can be serious outcomes, including suicide, if medication is not taken. A systematic review by Chong evaluated the impact of educational and behavioral interventions on antidepressant medication adherence and depression disease progression. This review showed that patient education alone did not improve medication adherence rates; however, when it was combined with behavioral changes and multifaceted interventions, adherence rates and depression outcomes improved. Behavioral and multifaceted interventions include education, telephone follow-up, medication support, and communication with primary care providers. For this reason, pharmacist intervention is crucial with antidepressants: proper counseling gives patients better insight into their medication and supports appropriate behavioral changes.

Pharmacists can help improve outcomes for depressed patients by counseling them on their medication. Antidepressants are different from other medications because they take longer to produce a noticeable effect. This presents an issue for patients, who may not feel the need to take a medication that does not help them feel better instantly. Also, patients might think they no longer need a medication once they are starting to feel better. Pharmacists should explain to the patient that it takes antidepressants at least two weeks to take effect. Patients should also be informed that there are common side effects associated with these medications and that it is important to continue taking antidepressants for at least six to nine months to prevent recurrence of depression.

Because untreated depression has many negative consequences, it is important to manage it with appropriate medications. Given their expertise on antidepressants, pharmacists can counsel patients on what to expect, the onset of action, and the duration of use for these medications. Through patient education, behavioral changes, and multifaceted interventions, patients can have better outcomes for their depression.

Urvi Patel, PharmD 2016

11/18/2014

Philly Issues First ‘Code Blue’ of Season: USciences Prof Explains What That Means to Homeless Population

Those fortunate enough to be somewhere warm during this recent cold snap might wonder how Philadelphia’s homeless population can survive the frigid outdoor conditions, said Stephen Metraux, PhD, associate professor of health policy and public health at University of the Sciences.

“Data on how many homeless are stuck out in the cold is difficult to come by,” said Dr. Metraux. “The homeless population is notoriously challenging to count, as they usually strive to stay inconspicuous amidst the public spaces to which they are relegated.”

The best available number comes from the City of Philadelphia’s annual “Point in Time” count, when teams of volunteers canvass the shelters and streets and count the number of homeless people and families that they encounter. Of the estimated 5,500 homeless individuals counted on a January night in 2013, 388 of them were unsheltered. That figure is down from the 526 individuals recorded on a January night in 2012.

So how do these hundreds of homeless individuals survive below-freezing conditions? The first line of defense is provided by the City of Philadelphia, which implements a “Code Blue” on any night when temperatures fall under or hover around 20 degrees, said Dr. Metraux.

On Code Blue nights, outreach workers and police can bring in any homeless person to stay in a shelter or other public facility temporarily designated for overnight accommodations. No one gets turned away and, if necessary, a Court Ordered Transportation to Shelter can be quickly obtained to bring resistant individuals indoors on occasions when leaving a person outdoors may subject him or her to danger from the cold weather elements.

“Code Blue’s success is best indicated by the rarity of hypothermia deaths among homeless persons in the past few years,” said Dr. Metraux. “But Code Blue is at best a stopgap solution; the best solution for protecting the homeless from the elements is through renewed efforts to reduce homelessness.”

The number of homeless people living on the streets across Philadelphia has been declining over the previous few years, due largely to innovative programs that engage the most recalcitrant homeless persons and provide them with housing and services, he said. 

10/30/2014

Chronic Medication Adherence: Diabetes

Diabetes is a chronic disease that affects millions of Americans. It is a group of metabolic disorders characterized by persistent hyperglycemia. Early diagnosis and proper treatment are important to reduce complications such as coronary artery disease, blindness, and loss of sensation. However, as Dr. C. Everett Koop stated, “Drugs don’t work in patients who don’t take them.”

Diabetes is one of the leading causes of death. Eating a healthy diet is especially important with this disease. Patients who have diabetes should be encouraged to modify their diet to include more vegetables, whole grains, fruits, non-fat dairy products, beans, lean meats, poultry, and fish. A weight-loss intervention study examining the relationship between self-monitoring of blood glucose and weight loss showed that increased self-monitoring and greater weight loss were achieved through better adherence to diet. The authors concluded that self-monitoring of blood glucose leads to increased adherence to diet. Also, patients who were educated about the impact of diet on weight loss showed increased adherence to diet and better glucose control.

A retrospective literature search was conducted by Cramer to assess adherence to oral hypoglycemic agents and insulin products and its effect on glycemic control in diabetes patients. In this systematic analysis, she found that electronic monitoring was effective in identifying patients who were poorly adherent. The study showed that electronic monitoring systems can be used to increase adherence by providing health care providers the information needed to identify patients who need interventions.

Pharmacists today use electronic monitoring through computerized programs that measure adherence indicators such as refill rates. There are many ways pharmacists can intervene to increase adherence rates with chronic medications, such as oral hypoglycemic medications. Non-adherence can be detrimental to patients, so pharmacists can intervene by counseling newly diagnosed individuals on the benefits of taking their medication properly and the risks that may occur if it is not taken. Also, pharmacists can review adherence rates with patients to identify reasons why they may not be taking their medications. If patients cannot tolerate certain medications, or cannot follow directions appropriately, pharmacists can suggest other products.

The cost of not taking medications is high for patients on chronic medications, so it is important that pharmacists and patients work together to create a regimen that is most beneficial.

 

Urvi Patel, PharmD’16
