Sunday, November 30, 2014

The Clean Air Act

The Clean Air Act, passed in 1963, requires that the Environmental Protection Agency “update air quality standards every five years, to ensure standards ‘protect public health with an adequate margin of safety’ based on the latest scientific evidence.” According to EPA Administrator Gina McCarthy, the current proposal “would lower the current standard of 75 parts per billion (the concentration of ozone pollution in the air we breathe) to a standard in the range of 65-70 parts per billion, while taking public comment on a level as low as 60.”

Of all the standards we studied in this course, relatively few had a clause requiring the requirements to be updated every five years, especially standards that apply to so many industries, including energy, manufacturing, and automotive. An aggressive, across-the-board standard like the one regulating air quality pushes industries to innovate and reevaluate their practices. As we have also seen in class, the more aggressive a standard is, the more noticeable its change or influence is after the fact.

Hopefully, with this updated air quality standard, we will see significant changes. Air quality affects the health of millions of Americans in ways many of us are unaware of. From asthma to drowsiness, even possibly cancer, these are all potential symptoms of dangerously poor air quality. Administrator McCarthy describes the economic effects related to these potential health hazards: “Missing work, feeling ill, or caring for a sick child costs us time, money, and personal hardship. When family health issues hurt us financially, that drags down the whole economy. The good news is that if these proposed standards were finalized, every dollar we would invest to meet them would return up to $3 in health benefits (totaling up to $38 billion in 2025, and going up from there). For our children, that means avoiding up to 1 million missed school days, thousands of cases of acute bronchitis, and nearly a million asthma attacks. Adults could avoid hundreds of emergency room visits for cardiovascular reasons, up to 180,000 missed work days, and 4 million days where people have to deal with pollution-related symptoms.”
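
To see what those figures imply, here is a small back-of-the-envelope sketch in Python; it is not the EPA's own calculation, just the arithmetic implied by the quoted numbers (a 3-to-1 return per dollar and up to $38 billion in total benefits in 2025).

```python
# Back-of-the-envelope reading of the figures quoted above (not the EPA's own math):
# if every dollar invested returns up to $3 in health benefits, and benefits total
# up to $38 billion in 2025, the implied compliance investment at that upper bound
# is roughly benefits / 3.
BENEFIT_PER_DOLLAR = 3.0      # "up to $3 in health benefits" per dollar invested
TOTAL_BENEFITS_2025 = 38e9    # "up to $38 billion in 2025"

implied_investment = TOTAL_BENEFITS_2025 / BENEFIT_PER_DOLLAR
print(f"Implied upper-bound investment: ${implied_investment / 1e9:.1f} billion")
# roughly $12.7 billion at the upper bound
```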


I hope that the EPA continues to implement aggressive environmental standards across the board and that eventually the United States can become a leader in sustainable and clean energy. The economic and health hazards of high ozone pollution affect every individual and corporation around the world. By setting this standard, the Environmental Protection Agency is also pushing corporations to become more efficient and innovative with their energy consumption and emissions. Throughout this course we have also seen the powerful effects of standards, including safety improvements and innovative changes. I believe that if the Environmental Protection Agency continues to implement and enforce these aggressive standards, we will see some great changes, including safety improvements for the masses and advances in energy efficiency.

Future Plans of the FDA in the Pharmaceutical Manufacturing Industry

The main regulatory body at large in the pharmaceutical manufacturing industry is the Food and Drug Administration (FDA). The FDA sets de jure standards responsible for protecting and promoting public health through the regulation and management of food safety, pharmaceutical drugs, and many other fields relating to public health. The main body of the FDA responsible for the development and approval of drugs is the Center for Drug Evaluation and Research (CDER). The goal of CDER is to ensure that drugs marketed in the United States are safe and effective. CDER recently published a strategic plan discussing the steps it will take to address pressing issues and its proposed ways of enhancing operations over the next five years. One of the main issues outlined in the plan is scientific innovation. CDER admits that not all current trends in the pharmaceutical industry are positive. “Despite the significant advances in basic research witnessed over recent decades, the cost of new drug research and development has never been greater, nor the failure rate in drug development higher” (FDA CDER). It is estimated that the failure rate in late-stage clinical development is around 50%. One of the major, recurring factors contributing to this failure rate is scientific uncertainty. New methods and technologies for gauging the clinical benefits and risks of a drug much earlier in development would help reduce these uncertainties, resulting in lower cost and risk in drug development. This would allow more innovators to invest in the development of new medicines.
One example of an area that would benefit from a reduction in scientific uncertainty is biotechnology. The pharmaceutical protein products developed by biotechnology processes are derived from complicated expression and production systems that usually involve a genetically modified host cell and growth media. Production conditions can greatly affect the final protein structure, which may result in changes in efficacy or safety of the product. “Understanding the relationships between production conditions, product characteristics, and clinical performance and safety is critical for both innovator biologics as well as biosimilars” (FDA CDER).
The foundation of CDER’s scientific innovation strategy includes addressing scientific uncertainties, collaborating to develop tools and approaches to deal with those uncertainties, qualifying new drug development tools for use in regulatory decision-making, and conducting training on informatics and data analysis and making them readily available to reviewers. Once these new scientific standards are established, the FDA can update its regulatory requirements and decision protocols to include these methods. By continuously revising regulatory requirements, the FDA incorporates scientific innovation into its management strategy. All prescription and over-the-counter drug manufacturers are required to comply with FDA regulations and therefore integrate scientific innovation into their business and manufacturing models.
CDER’s strategic plan also includes scientific innovation goals and initiatives, including information about when each initiative will start and the key external stakeholders. For example, one of CDER’s initiatives is “Advancing regulatory sciences related to innovator and generic product manufacturing and quality” (FDA CDER). This initiative is labeled as on-going, with stakeholders in academia, regulated industry, research consortia, standards organizations, and other federal agencies. It is interesting to note that several of the initiatives have standards organizations as one of the stakeholders. The FDA recognizes that some of the work it does directly affects how standards organizations like ASTM and USP develop their standards. Standards organizations need to be aware of the drug approval process and the quality regulations outlined by the FDA in order to ensure a seamless transition from the research and development laboratory to clinical production and manufacturing. As with many scientific fields, the regulatory bodies and standards organizations of the pharmaceutical industry are interconnected and dependent on each other for proper functionality.


Table 1: CDER’s Scientific Innovation Initiatives

| Scientific Innovation Initiative | Start | Key External Stakeholders |
| --- | --- | --- |
| Advancing the Use of Biomarkers and Pharmacogenomics | 2013 | Regulated Industry; Research Consortia |
| Advancing Development of Patient-Reported Outcomes (PROs) and Other Outcome Assessment Tools | 2013 | Patients; Research Consortia; Regulated Industry |
| Advancing Development of Drugs for Rare Diseases | 2013 | Patients; Research Consortia; Regulated Industry |
| Advancing the Science of Meta-Analysis Methodologies | 2012 | Other Federal Agencies; Regulated Industry; Academia |
| Advancing the Development of Predictive Safety Models, Biomarkers, and Assessment Tools (e.g., through public-private consortia) | On-going | Patients; Research Consortia; Regulated Industry; Other Federal Agencies |
| Advancing Social and Behavioral Science to Help Consumers and Professionals Make Informed Decisions about Regulated Products | On-going | Patients; Research Consortia; Regulated Industry; Academia |
| Advancing Regulatory Sciences Related to Innovator and Generic Product Manufacturing and Quality | On-going | Academia; Regulated Industry; Research Consortia; Standards Organizations; Other Federal Agencies |
| Advancing Regulatory Science Related to the Manufacture, Characterization and Assessment of Biologic Drug Products | On-going | Academia; Regulated Industry; Research Consortia; Standards Organizations; Other Federal Agencies |
| Advancing the Development of Electronic Data Analysis Tools to Enhance Review Capabilities | On-going | Standards Organizations; Private Industry; Academia; Research Consortia; Other Federal Agencies |

Source: FDA Center for Drug Evaluation and Research (CDER), Strategic Plan 2013-2017

Friday, November 28, 2014

Need for standards in computer education


Computer science is a field that teaches the critical thinking and problem-solving skills that are very valuable today, especially in STEM fields. The quality of computer science education in K-12 varies greatly depending on what state and even what school district you belong to. Standards that govern computer science are often bundled with technology standards, causing a majority of states and school districts to treat them as free electives. Groups like the Computer Science Teachers Association (CSTA), Microsoft, and Code.org have begun an initiative to change this by pushing states to offer computer science courses as core courses rather than electives. From 2013 to 2014 the number of states that include computer science as a core discipline increased from 9 to 25, a positive trend but not enough given the demand. Computer science jobs are growing at more than 2 times the national average in most states and over 4 times the national average in states like New York, New Jersey, and California. Data from Code.org shows that by 2020, there will be over 1 million more computer science jobs than there are students with computer science degrees if the current education trajectory continues. 90% of schools nationwide still do not offer any form of computer science course. States that leave these standards decisions to the school districts have been the slowest at adapting to this critical 21st-century need.




There has been a major rebranding of computer science over the last few years to make it more accessible and desirable to today's youth. Fred Humphries, Microsoft's vice president of government affairs, said during a recent panel discussion in Washington, “The fact of the matter is if you’re going to have the job of the future, you better have some type of background in computer science.” A study of student interest in STEM fields conducted by the ACT shows that interest is high and improving (over 50% of students taking the ACT were interested in pursuing STEM). The study also found that students who are interested in pursuing a career in STEM scored higher on all sections of the ACT and were more engaged in leadership programs. Computer science skills are directly applicable to all STEM fields and give an advantage to students who pursue higher education. I believe that a national standard that states would be incentivized to follow is needed in order to speed up the inclusion of computer science programs in schools. Computer science teachers often feel discouraged, as their classes are often seen as less important than core discipline classes. Courses are often designed for smaller class sizes, which brings a slew of logistical problems that need to be addressed. State standards need to adhere to the CSTA criteria, since many believe that basic usage (i.e., being able to use a web browser for research or a word processor) is enough. Initiatives like the “Hour of Code,” which has been attempted by 48 million students, and the Code.org intro CS curriculum, which 99% of teachers recommend, need backing in standards or else their effects are constrained to a single session. Being able to think critically about technology and its social effects is a necessary skill that is being overlooked in schools. I want to live in a world where legislation about technology is questioned by the masses to prevent those in power from taking advantage of them.




http://www.usnews.com/news/stem-solutions/articles/2014/11/25/making-it-count-computer-science-spreads-as-graduation-requirement?int=a13909

http://www.usnews.com/news/articles/2013/12/27/tech-companies-work-to-combat-computer-science-education-gap

http://www.usnews.com/news/stem-solutions/articles/2014/11/19/act-student-interest-in-stem-remains-steady-for-2014-graduates

EPA's New Stricter Ozone Standard

Last Wednesday the Obama administration announced a regulation, released under the authority of the Clean Air Act, to curb ozone emissions. This regulation, predicted to be fully in force by 2050, aims to reduce the smog-causing pollutant from the current 75 parts per billion standard, set in 2008 by the Bush administration, to anywhere from 65 to 70 parts per billion. According to the NY Times’ Coral Davenport, on one end it is hailed as a “powerful environmental legacy” by environmentalists and public health advocates, while at the other it is called a “costly government overreach” by manufacturers and industry. While the new 65 to 70 parts per billion proposed standard is estimated to cost industry $3.9 billion to $15 billion in 2050, it is also estimated to prevent 320,000 to 960,000 asthma attacks in children, 330,000 to 1 million missed school days, 750 to 4,300 premature deaths, 1,400 to 4,300 asthma-related emergency room visits, and 65,000 to 180,000 missed work days by 2050. The EPA estimated that the latter health benefits would outweigh the former costs by anywhere from $6.4 billion to $38 billion in 2025, depending on what standard is chosen.
This new standard proposal is finally being pushed forward after it was halted in 2011. The EPA originally planned to release it that year, but with powerful opposition from Republicans and industry, and the approaching 2012 election, Obama decided to delay the release on the grounds of the distressed economy. But little has changed; the Republican-majority Congress, led by majority leader Senator Mitch McConnell, plans to block or overturn the rule, and others like it. The director of regulatory affairs for the American Petroleum Institute, lobbying for the oil industry, said that “the current review of health studies has not identified compelling evidence for more stringent standards, and current standards are protective of public health.” Meanwhile, EPA Administrator Gina McCarthy wrote in an op-ed for CNN, "Critics play a dangerous game when they denounce the science and law EPA has used to defend clean air for more than 40 years. The American people know better."
As this battle takes full force, how do these standards get set? According to the Clean Air Act, the EPA's Office of Air Quality Planning and Standards (OAQPS) sets the national ambient air quality standards (NAAQS) for harmful pollutants and makes sure these standards are met through various monitoring programs, such as the Ambient Air Monitoring Program, Enhanced Ozone Monitoring, and Air Pollution Monitoring. There are two types of standards: primary standards, which protect against adverse health effects, and secondary standards, which protect against welfare effects. There are six criteria pollutants that the NAAQS address and the Air Pollution Monitoring program monitors: carbon monoxide, nitrogen dioxide, lead, particulate matter, sulfur dioxide, and ozone, the last of which is being addressed by the recent smog reduction regulation.
When an area is found to contain high levels of any of the six criteria pollutants, it is considered a “nonattainment” area. Levels are measured and reported in accordance with the standards and testing methods developed by the EPA’s Emissions Measurement Center. States containing nonattainment areas are required by the OAQPS to develop a written state implementation plan in which they outline the efforts they will make to reduce air pollutant levels and reach “attainment.” But what will this mean for smog-ridden states like California, which might end up with many “nonattainment” areas under the new proposal? According to Scientific American’s Valerie Volcovici, the EPA has “cited flexibility to allow for ‘unique’ situations, such as in California, a massive state with a varied environment.” But states have up to 20 years to meet the standard before the federal government cracks down.
http://www.epa.gov/airquality/cleanair.html

Friday, November 21, 2014

Re. Time Management Meeting Rescheduled. Reason: Time Conflict


        Time is precious… or so we’re told. It’s oddly one of the few aspects of life that people have actually realized are important. Breathing? Let’s burn these leaves and breathe deeply… I mean, I don’t inhale. Hydration? Nah, get me a beer, some vodka, a soda, iced tea, or maybe just skip the drink altogether. I don’t want water, beer has water in it. Food? Give me the best tasting, fattiest, greasiest thing you have… extra bacon. Or maybe I’ll go with the salad that *gag*, or give me like twenty toppings for that salad and a ton of dressing. Sleep? I’m not *yawn* tired… I can keep going.

But time? We don’t live forever. We only have so much time to get things done. How much time will this take? 20 min? I don’t have that much time. Do you have a TL:DR (too long, didn’t read)? A summary? An Abstract? What’s the elevator pitch? Don’t forget to document everything (which will never be read until something bad happens, and then it will be “why didn’t anyone read this?”), but we’ll only cover the abstracted summary of this elevator pitch. Can we go to the movies mom? Sorry, I don’t have enough time.

As of 8:30pm yesterday, my academic “hell-week” is over. Now it’s project time. But what is “hell-week”? I know enough people use it, or at least understand the meaning. I started using it in high school to define the week of school near the end of the year/semester/etc. that everything becomes due for grading. I got the term from watching some videos about the Navy Seals where, during training, they have a “week of hell” where everything happens and they’re given little to no time to sleep, eat, or relieve themselves. It’s not that bad, but when you have between 4 and (not me, but I know some people) 11 classes, and they all have one or more assignments due, stereotypically on the same day, you start to test the limits of human endurance to get everything done.

One interesting observation is that there are multiple people, all going through the same classes, all on the same schedule… yet they all have a personal standard for time management. For me, I needed to write this post, and decided instead to get food. Is this post that low on the “priority list” that I opted to get food then write it? Yes. But for someone else, they might put off sleep for two days in order to finish their own blog post, and any other work they have. In business, deadlines are made for everything, regardless if needed or not. Going to have lunch? Start at 12pm and finish by 1pm. I should get this abstract completed in 15 min, let me setup a calendar entry.

That actually brings up a different point: planners and non-planners. There are some people, such as myself, for whom, if you look at their calendar, lunch, travel, work, personal projects, phone calls, etc. all have a time and place. Now, if you actually did look at my calendar, you’d say “aha, you have free time here, here, and…” halt right there. For all the planning I do, I’m bad at determining how long a task will take. Being able to estimate how long a task will take is like a gift: some people can tell you the exact minute they will finish something, others will be years off. Due to bad time prediction, I leave gaps within my schedule so that I can reschedule tasks without destroying my entire schedule or needing to drastically change plans.

The exact opposite would be the people who, when you open this same calendar, are greeted by “Welcome! This is your calendar. You can use it to make schedules.” As in, they don’t plan. I’ve heard stories of people who will decide to go see a movie, then in the middle of the movie, leave to get dinner because their friends are going to a local place. Wait, what happened to the movie? “We’ll watch it after we eat.” They never actually watch the movie, but somehow end up watching some small town’s firework display for the town’s fair. I tend to view these individuals as having more money than the average person, as they always seem to be out doing something, and never doing something on their own, watching the movie at home with friends, or inviting the friends over to try to make something for dinner: stuff that is not only more sustainable, but always seems to be more fun because it is personal instead of superficial. I tend to interact with these people by making a calendar entry, “Might <event(s)> with <person(s)>,” so I don’t schedule anything important then, as I may never get that time back.

In terms of standards, time management is something we will never have consensus on, so we each operate our own way. In a business, it would practically be de jure, where you are told when something is due, often in some top-down manner. Stockholders demand results before the competitor’s, the CEO wants results a week earlier, higher-ups want it a week before giving it to the CEO, mid-tiers want it a week before that, and base-level bosses want it a month before that. Suddenly you, the worker, need to have a meeting to plan the schedule for when tasks are due, and it will somehow end up being “we’re already late, we need to do this double time.”

Time is precious. I tell people to “make time,” a half-joking, half-serious statement, as I try to squeeze as much into a small block of time as possible. While it backfires often enough, it has at least given me a drive to accomplish more and look for natural efficiency rather than reducing the quality of my work. One area where there is a de facto method of time management is travel to more than one location. While mathematics, physics, and problems like the traveling salesman problem prove that it doesn’t always give the shortest route, it is considered normal and proper to arrange the destinations of a trip in order of distance from the prior point. While some will now say it’s to reduce fuel costs, it mainly stems from people wanting to reduce travel time. Whoever said travel is about the journey, not the destination, obviously never traveled anywhere.
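
That “closest stop next” rule is simple enough to sketch in a few lines. Here is a minimal, hypothetical example of the nearest-neighbor ordering described above (the coordinates are invented), which, as noted, is not guaranteed to be the truly shortest route.

```python
# Minimal sketch of the "visit the closest remaining stop next" rule described
# above (a nearest-neighbor heuristic). Coordinates are hypothetical; as the post
# notes, this ordering is not guaranteed to be the shortest possible route.
from math import dist

def nearest_neighbor_route(start, stops):
    """Order stops greedily by distance from the previous point."""
    route, remaining, current = [], list(stops), start
    while remaining:
        nxt = min(remaining, key=lambda p: dist(current, p))
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    return route

home = (0, 0)
errands = [(5, 1), (1, 2), (4, 4), (2, 0)]    # hypothetical destinations
print(nearest_neighbor_route(home, errands))  # [(2, 0), (1, 2), (4, 4), (5, 1)]
```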

Time is widely viewed as one of the most valuable resources humanity has, but it is also one of the least organized. Even organized systems like time zones, calendars, and clocks, while “standardized” in their form, are used differently depending on who is using them and where. Planners and non-planners will work with time in vastly different ways, and even within each of those sub-groups, they will have different ways of scheduling tasks. But as this took me much more time than I was hoping for, I must go and do other work. Time is of the essence.

The Standard of Gluten intolerance

As a disclaimer for myself, I am going to start this blog by stating that I am in no way ignorant of the fact that every day people suffer from serious allergies.  I have witnessed firsthand how dangerous it can be for a person to experience ill effects from even being around the item they are allergic to, and I am grateful not to be affected by such a terrible illness.  People with celiac disease, commonly described as a gluten allergy, have antibodies that attack the small intestine and cause poor nutrient absorption.  That being said, the new gluten-free fad storming across America has caused food stores to expand their gluten-free sections dramatically.  This rapid increase is a godsend for those who suffer from this illness, but those who hop on board just to avoid the “poison” of gluten can actually end up eating less healthily.
Gluten itself holds no special nutritional value; it is a protein found in wheat, rye, and barley.  Many foods that nutritionists have been recommending for decades contain this protein alongside a variety of beneficial vitamins and minerals.  With the rise of gluten awareness in the past decade, the mass population is sampling this diet to see if they are affected by the gluten contained in these foods.  There is some possibility of anyone being sensitive to gluten, but a change in diet such as this has a very good chance of causing more harm than good.  In addition, when shopping in the gluten-free section, the majority of items are similar to processed foods that contain little to no nutritional value. In 2013 the FDA produced a standard mandating that foods labeled gluten-free contain essentially no gluten, but nothing said the foods had to meet a minimum nutritional requirement.  This leads to masses of people searching for a magic diet that is worse for them than what they were originally eating.

Separate from those who are simply trying to find a magic diet are those who claim they have gluten intolerance and must “cut back” on foods containing the protein.  Almost every study conducted finds that merely reducing foods containing gluten provides little to no benefit to the consumer, because even a little gluten can cause antibodies to attack the small intestine.  Like common controversial topics such as MSG in Chinese food causing headaches, or Wi-Fi causing health issues, the rise of gluten awareness started a chain reaction of followers claiming to suffer from minor irritation due to gluten.  This is complete B.S.!  A study at Monash University in Australia carried out an experiment with 37 gluten-intolerant people as test subjects.  All were fed a baseline diet suitable for a person with celiac disease, and then each was placed in one of four groups with diets containing increasing amounts of gluten.  The results showed that every single subject reported gastrointestinal issues regardless of what group they were put into.  If you do not suffer from a genuine illness, do not avoid gluten; it is not cool, and it can even lead to worse health effects.

References:
http://www.huffingtonpost.ca/2014/05/14/gluten-intolerance-fake_n_5327420.html
http://www.webmd.com/diet/healthy-kitchen-11/truth-about-gluten?page=1
http://celiac.org/celiac-disease/what-is-celiac-disease/

Wednesday, November 19, 2014

Sneak Peek into Maritime Standards

This week I wanted to change things up a bit, and instead of talking about some arbitrary topic I wanted to give a sneak peek at what I am going to be writing my paper on for this semester. I think this is a great time to do it, as it forces me to look at where I want to go with it and get some things down on paper.
Ever since freshman year I have set my path towards a future in maritime work. Boats and ships, and just being out on a lake, have always been a passion and a hobby for me, and that’s where I intended to shape my future. Every year I gain a little more knowledge and seek to have as many experiences as possible. I have been lucky enough to observe a wide range of vessel testing at Stevens in the past year, and I feel I have been able to learn more from those experiences than in any class. It came to my attention that there is a lot more to the maritime industry than meets the eye. A huge portion of world trade happens over the ocean via cargo ships. Oil, merchandise, food, resources: you name it and it probably gets shipped somewhere by sea. In the trade business, everything is about making money, and shipping over water seems to be the cheapest and most economical way to do it. However, when it’s all about money, there is bound to be someone pushing the limits. Stacking another row of cargo boxes or adding just a little bit more product onto the ship; it all adds up until the day that little extra causes a disaster. I will be looking into the shipping industry and how the standards for shipping vessels affect the trade world. I also recently read an article about how the increasing amount of shipping traffic in the oceans is causing whales to physically change their communication frequencies to be heard over the noise of the ships. I will be briefly touching on that, and the standards we need to look at to protect our natural wildlife.
The other portion of my paper is going to look at the leisure side of the maritime community. Everyone loves to go on cruises! However, I think we have all seen the news lately on the multiple incidents on cruise lines. Between sickness breaking out twice in the past six months and one ship running aground overseas, there are certainly some things that don’t seem up to standards. In speaking about these areas in class, a professor of mine has expressed his concern that there will be a significant disaster sometime, due to some of the same principles the commercial shipping industry runs on: a money-making business, more people per trip, etc. This should be an interesting topic to dive into, to see where the future of cruise lines is heading, hopefully towards higher standards for better experiences.
I think this topic links a lot to the topics we have covered in our standards class in terms of the auto and food industries. During the development of auto standards, the manufacturers were resistant to changes towards a safer vehicle because the profit margin would not be as great; much like both portions of the maritime industry, which seek to put the most people or product into the fewest loads or trips. This also relates to the food industry: just as health standards started to rise in the 1900s, we need to make sure we have top standards for handling the food on cruises. When you are hundreds of miles out at sea, no matter how you look at it, you won’t have an immediate response for foodborne illnesses. I look forward to researching this topic and conveying the findings.

Sunday, November 16, 2014

And you thought SHOPPING for shoes was difficult?



I wear a size fourteen shoe (in most brands), but my foot from heel to toe is obviously not 14 inches long. So how exactly do brands come up with the standard size for a 14 shoe? Well (not so surprisingly) it’s different in a lot of different places!
There is an international standard known as the “Mondopoint” system of sizing and marking, based on foot width and length in millimeters. For example, a shoe size of 280/110 means that a person’s foot has a length of 280 millimeters and a width of 110 millimeters. But alas, while it makes perfect sense, the only people in the world to adopt this system are ski boot makers and NATO militaries.
In the U.K., their system makes sense…to nobody except cobblers in the 16th century. Their shoes are measured in units called barleycorns (a barleycorn is exactly 1/3 of an inch), and not by the length of the foot but by the “last,” which is the wooden device shaped like a foot that is used when making and repairing shoes. Sizes are also commonly measured in “hands.” A hand is equivalent to 4 inches, 12 barleycorns, or 10.16 centimeters. The calculation for a child’s shoe size is 3 × last length in inches - 12, and for an adult it is 3 × last length in inches - 25.
In the U.S there is seemingly no specific method used when making sizes. Everyone who buys shoes knows this. Nike’s always run big, while feet seem to fit into smaller sizes with classy shoes. While sizes are basically similar, there is no defined method across the board.
You’re probably thinking to yourself, “I stick my foot in that metal contraption with the sliding thingy in the store and it says I’m definitely a size 10!” Well, that contraption is called a Brannock device, and when Brannock invented it in 1925, he used a formula that estimates a foot is 2/3 of an inch shorter than the last, which puts a men’s size 1 at a foot length of 7 2/3 inches, going up one size for every additional 1/3 inch. Women’s sizes are one size higher, but use the same method.
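
To make the arithmetic concrete, here is a minimal sketch in Python using only the formulas quoted in this post (the Mondopoint length/width designation, the UK barleycorn formulas, and the Brannock estimate); the sample foot length is hypothetical.

```python
# A minimal sketch of the sizing math described above, using only the figures
# quoted in this post. The sample foot length is hypothetical.

def mondopoint(length_mm: int, width_mm: int) -> str:
    """Mondopoint designation: foot length / foot width in millimeters."""
    return f"{length_mm}/{width_mm}"

def uk_adult_size(last_inches: float) -> float:
    """UK adult size = 3 x last length in inches - 25."""
    return 3 * last_inches - 25

def uk_child_size(last_inches: float) -> float:
    """UK child size = 3 x last length in inches - 12."""
    return 3 * last_inches - 12

def us_mens_size(foot_inches: float) -> float:
    """Brannock estimate: men's size 1 at a 7 2/3 inch foot,
    one size per additional 1/3 inch (i.e. 3 x foot - 22)."""
    return 3 * foot_inches - 22

def us_womens_size(foot_inches: float) -> float:
    """Per the post, women's sizes run one size higher than men's."""
    return us_mens_size(foot_inches) + 1

foot_in = 11.0                 # hypothetical foot length, in inches
last_in = foot_in + 2 / 3      # Brannock: the last is 2/3 inch longer than the foot

print("Mondopoint:", mondopoint(280, 110))
print("UK adult:", round(uk_adult_size(last_in), 1))      # 10.0
print("US men's:", round(us_mens_size(foot_in), 1))       # 11.0
print("US women's:", round(us_womens_size(foot_in), 1))   # 12.0
```
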
On the European continent they use Paris points to measure the last, where each point is 2/3 of a centimeter. Asian countries use a system based in centimeters, in increments of 5 mm, and then assign a level of girth to the size, A-G, indexed in a table. There are different tables for men’s, women’s, and children’s shoes. So there you have it: the conundrum of shoe sizes, oversimplified in my blog. Here’s a fun visual aid comparing shoe sizes for men.

Emoji Standards 😓 😲 😵

From iPhones to billboards everywhere, emojis are here to stay. Glyphs are on the rise. Since they are so hip, let's see how they are standardized, if at all.

Right now, when computers display text, they almost always use Unicode. The Unicode Consortium (Unicode) aims to be able to represent *every* character or glyph made by humans. This ranges from Latin characters (the ones you are reading now) to Russian and Japanese characters. Since humans now commonly communicate via emoji, more of them were added in [Unicode 7.0][unicode-7].

Standardizing emojis has some problems. One of the first problems is how do we ethically choose what the emojis look like? For example, by only choosing white toned emojis, the standard could exclude many of the world's inhabitants. This problem [has been pointed out by previous bloggers][neal].

Luckily, the issue of emojis looking different does not seem to be too much of a problem. According to Unicode's [official documentation][unicode], there is no need to standardize emoji colors. From their FAQ, "Unicode does not require a particular racial or ethnic appearance—or for that matter, a particular hair style: bald or hirsute". Unicode even took the time to make sure that not even hair was discriminated against!

Another problem is: which glyphs are important enough to standardize? Symbols such as smiley faces are general enough to include. However, what about the symbols that are so specific that it is hard to find a use for them? The strangest glyph I found was a symbol for Lilith (⚸), a hypothetical second moon of the Earth. I am in favor of diversity, but this celestial being **does not exist**, and never has. Why was this chosen over seemingly infinite other useful nouns? There have been a few times I have needed a toaster emoji, but was only able to talk to my friend about Lilith (sarcasm).
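
As a quick illustration, here is a minimal Python sketch showing that these glyphs are just Unicode code points: the Lilith symbol above is U+26B8, and a typical face emoji such as U+1F600 works the same way (the character names come from Python's bundled Unicode database).

```python
# Minimal sketch: emoji and other glyphs are ordinary Unicode code points.
# U+26B8 is the Lilith symbol discussed above; U+1F600 is a common face emoji.
import unicodedata

for ch in ["\u26B8", "\U0001F600"]:
    name = unicodedata.name(ch, "UNKNOWN")
    print(f"U+{ord(ch):04X}  {name}  {ch}")
```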

[neal]: http://stevensstandardsandsociety.blogspot.com/2014/11/unicode-emoji-diversity-problem.html "Neal Blog on Emoji"
[unicode]: http://www.unicode.org/faq/emoji_dingbats.html#2.4 "Emoji FAQ"
[unicode-7]: http://unicode-inc.blogspot.com/2014/10/unicode-version-70-complete-text-of.html "Unicode 7.0.0 announcement"

Non-Regulatory Food Safety Management

Around the world, thousands of people are employed in the food safety management industry, with millions of dollars invested in food safety research and management, in addition to numerous inspections and tests conducted by governmental agencies and non-governmental organizations. Food safety still remains an issue of utmost importance and a major priority in public health. In response to the increasing awareness and concern regarding food safety, international organizations, governments, non-governmental organizations, retailers, and producer associations have implemented a large number of food safety management regulations, standards, and guidelines to assure food safety. Some of them are compulsory requirements for food companies (government-regulated de jure standards), while others are not. Schemes which are not mandatory requirements from governments are defined as non-regulatory schemes. Most non-regulatory food safety management schemes (FSMS) are voluntary. However, they often become de facto standards, in a business sense, because they are adopted by dominant market players in the food supply chain.

The food and beverage industry is the largest manufacturing sector in New Zealand, and is of paramount importance for the national economy. It consists of about 2,000 enterprises and employs more than 80,000 people. The industry is dominated by several main categories: dairy, meat, seafood, fruit and vegetables, wine, and specialty food industries.

The primary food safety regulating authority in New Zealand is the Ministry for Primary Industries (MPI). The MPI administers four main acts: the Food Act 1981, the Animal Product Act 1999, the Agricultural Compounds and Veterinary Medicines Act 1997, and the Wine Act 2003. In order to meet regulatory requirements, the industry needs to implement risk-based management programs, such as Risk Management Programs (RMPs) and Food Safety Programs (FSPs). These programs have to be independently audited by MPI approved verifiers. Besides regulatory requirements, food and beverage manufacturing enterprises have to meet non-regulatory requirements whether they supply international or domestic markets. A study was performed on the implementation of non-regulatory FSMS in the New Zealand food and beverage manufacturing industry.

A questionnaire was developed based on regulatory and non-regulatory food safety management literature and was issued to 419 food and beverage manufacturers. Among respondents (28.54% response rate), seventeen non-regulatory food safety management schemes have been implemented. These schemes can be categorized into three groups: public international schemes, public industry sector schemes, and private individual firm schemes. “A total of 45 enterprises have implemented public international standard schemes which include Hazard Analysis and Critical Control Point (HACCP) and the ISO 22000 Food Safety Management System (ISO 22000). A total of 18 enterprises have public industry schemes in place, including British Retail Consortium (BRC) Global Standard for Food Safety, Safe Quality Food (SQF) Program and Food Safety System Certification (FSSC) 22000” (Encheng Chen).
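
As a rough sketch of what those figures imply (using only the numbers quoted above; the survey itself reports more detail, and the third category's counts are not given here), the arithmetic looks roughly like this:

```python
# Rough sketch of the survey arithmetic quoted above; figures come from the post
# (419 questionnaires, a 28.54% response rate, and 45 and 18 enterprises in the
# first two scheme categories).
surveyed = 419
response_rate = 0.2854

respondents = round(surveyed * response_rate)   # about 120 enterprises responded
international = 45                              # HACCP, ISO 22000
industry_sector = 18                            # BRC, SQF, FSSC 22000

print(f"Approximate respondents: {respondents}")
print(f"With public international schemes: {international / respondents:.0%}")
print(f"With public industry sector schemes: {industry_sector / respondents:.0%}")
```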

A major factor in the adoption of non-regulatory FSMS by food businesses is the work of developing and implementing these requirements within their current food safety management system, a process that takes time and financial resources. The survey investigated the changes in food quality and safety management, market performance, production cost, and the relationship with customers as a consequence of implementing non-regulatory FSMS. The study showed that improvements in product traceability, the quality of internal procedures, and the food safety awareness of employees were among the most statistically significant desirable changes. It also found that the ability to maintain current customers and to attract new customers had increased. Respondents experienced increases in the costs of laboratory tests, record keeping, and training; other cost increases related to monitoring the production process and internal audits. Respondents also indicated that the number of customers increased, and that they worked more frequently with customers on food safety assurance.

The study addressed some of the challenges that food and beverage processing enterprises faced during the implementation of non-regulatory FSMS. Respondents could select up to five important areas in which their firms had encountered challenges, related to finance, infrastructure, or people. The most challenging areas reported were increased paperwork, record keeping and documentation, the cost of development and implementation, the technical knowledge and skills of employees, resistance to change by employees, the cost of training and education, and access to adequate information. Respondents also reported ways in which their enterprises overcame the challenges associated with the implementation of non-regulatory FSMS. Nearly three-quarters of the respondents claimed that they invested in education and training as a method of overcoming challenges. Other respondents noted improvements in internal communications and interventions to alter the organizational culture.

This study provided baseline information on the implementation of non-regulatory food safety management in New Zealand and adds substantially to the understanding of non-regulatory food safety management. The conclusions drawn from this study could help the owners of these schemes, or the drafters of standards, to improve them. Increased attention should be paid to the challenges encountered by food businesses and the critical factors influencing effectiveness, as outlined in the previous paragraph. With a better understanding of the motivators for food businesses to implement non-regulatory FSMS, scheme owners could design systems that maximize implementation and therefore result in better regulation of food safety. These findings and conclusions can be taken beyond the scope of the study and applied to the creation of non-regulatory FSMS in other countries.

Supporting charts and figures: see the source publication below.

Source:
Encheng Chen, Steve Flint, Paul Perry, Martin Perry, Robert Lau. “Implementation of non-regulatory food safety management schemes in New Zealand: A survey of the food and beverage industry.” Food Control, Volume 47, January 2015, Pages 569-576.