Links I’ve encountered #2

This time, it’s impressive web demos using complex animation algorithms and basic browser tools. Oh, and an ML post.

Realistic Terrain in 130 JS lines – Pretty impressive demo. I’m still not sure what goes on with the code because of all the WebGL & physics involved. However, it shows how far one can go with simple browser tools.

Creating 3D worlds with HTML & CSS – It seems like the new fad is doing nearly everything in basic CSS(3) that was done with JS before. This demo reminds me of countless hours wasted on CS:CZ.

Neural Networks, Manifolds & Topology – Good article with impressive visualizations showing what goes on inside NNs’ hidden layers. It’s not uncommon to hear hidden layers referred to as black boxes, but the article sheds some light under the hood.


The New Econometrics?

First, let me get this out of the way. This is going to be an emotional post. Knowing myself, it will end up reading like a rant riddled with spelling errors (so not much of a difference there). Why, you ask? Because I care about Economics and I’m mad we were robbed of a good topic that any Econometrics student should be offered. We are still being robbed. So read this like a letter to both students and professors.

Econometrics is about data. Econometrics is about analysis and distilling information to obtain, from the data, the best possible picture of the population at large. This isn’t statistics used to unearth correlations but, like any self-respecting economist, to unearth causations; always trying to answer the why within a phenomenon. So, on one hand, I can forgive economists’ unfortunate habit of shying away from heavy data crunching. However, it is an unforgivable sin not to mention, at least in passing, the wealth of options surrounding students.

So what is this thing I keep raving about? Well, it is known as data mining and/or machine learning (ML). I will avoid explaining the differences between the two (mostly because the answer is a bit vague, especially for the scope of this article) 1. As for the field itself, it uses algorithms to sift through data and extract meaningful relationships. Alright, that sounds kind of like Econometrics. And that is exactly my point. Having knowledge of the field makes you an even more complete econometrician. Remember all the linear regressions you ran in Econometrics? Well, that is the first algorithm found in intro to ML. Basically, the entire semester I spent learning Mathematical Economics (which is like advanced Econometrics) was covered in a week. Then came logistic regression (an even more useful algorithm). Then came neural networks. Then came feed-forward neural networks and (wait for it!) backpropagation. OK, I will stop with the forced revision. My point stands: there are exciting and useful algorithms that can be used to detect relationships and avoid errors.

Ignoring the hype with big data 2, think of how much data is generated every single second. Think of events happening that were once hard to measure/track: mobile phones, geo-location, PaaS, SaaS and multiple ways fixed costs have become variable costs. Hal Varian puts it best,

“There is now a computer in the middle of most economic transactions. These computer-mediated transactions enable data collection and analysis, personalization and customization, continuous experimentation, and contractual innovation. Taking full advantage of the potential of these new capabilities will require increasing sophistication in knowing what to do with the data that are now available” 3

I should note that Hal is the main reason I am writing this post. He works as the chief economist at Google. He has also written one of the most intriguing papers (Big Data: New Tricks for Econometrics) 4 concerning the future of Econometrics – a must read if you have read this far.

I pointed out exciting algorithms that might change the way we approach analysis. Some algorithms correct and anticipate their own errors! Not even joking! Remember when we had to account for bias in sampling? Well, ML has a better solution for correcting for this automatically 5. Allow me to quote Varian again –

“Our goal with prediction is typically to get good out-of-sample predictions. Most of us know from experience that it is all too easy to construct a predictor that works well in-sample, but fails miserably out-of-sample. To take a trivial example, ‘n’ linearly independent regressors will fit ‘n’ observations perfectly but will usually have poor out-of-sample performance. Machine learning specialists refer to this phenomenon as the ‘overfitting problem.’ ”

To the point: you end up with algorithms that penalize themselves 6.
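
For a taste of what “penalizing themselves” looks like in practice, here is a rough sketch (not from Varian’s paper; the function and data names are made up) of a ridge-style cost: the usual squared error plus a penalty on large coefficients, so the fit is discouraged from chasing noise.

function ridgeCost(coefficients, predictions, actuals, lambda){
  var error = 0, penalty = 0;
  //the familiar econometrics part: the sum of squared residuals
  for(var i = 0; i < actuals.length; i++){
    error += Math.pow(actuals[i] - predictions[i], 2);
  }
  //the ML part: large coefficients are taxed, shrinking the model towards simplicity
  for(var j = 0; j < coefficients.length; j++){
    penalty += Math.pow(coefficients[j], 2);
  }
  return error + lambda * penalty;
}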

It would be unfair to blame the undergrad Economics syllabus for not including ML concepts. I should note that most of these concepts are relatively new. Case in point: Varian’s paper is still a working paper (last revised a week ago as of publishing this post). ML is also mostly computer-science driven. The algorithms are not written with economic theories in mind. This should not be an excuse, however, because interdisciplinary studies are not uncommon. There is also the lack of basic coding knowledge among most economics students. I personally believe any student who takes econometrics and wants to go into the field should at least have basic coding skills, but that is an argument for another day.

In hindsight, this stopped being angsty rather quickly. However, I am still disappointed I missed out on exciting new topics during my earlier economic analysis lessons. Let this be a lesson to any econometrics student. There are mind-blowing projects and ventures popping up. You should not, however, think you will stop predicting wages versus education and age. That thing haunts you everywhere. Seriously, it’s everywhere!

1. [Stack Exchange has a good discussion on the differences.]

2. [I don’t think it’s even hype anymore. You know it’s mainstream when government scandals are invited to the party!]

3. [Varian, Hal. 2014. Beyond Big Data.]

4. [Varian, Hal. 2013. Big Data: New Tricks for Econometrics.]

5. [I understand that some of these methods are already applied in certain Econometrics works. Feel free to point out other interesting projects using these methods.]

6. [One of the funniest tweets from ML Hipster. You should follow him.]

Alternative Cross-Origin Resource Sharing (CORS) Techniques

My first encounter with cross-domain requests was when I was creating the UK football league stats table. My first approach was to download the data in a csv file and load it from my server. However, I discovered that it would be inefficient to keep replacing the file with the most recent data after each update (matches are played at least once a week). I looked into ways I could easily query the data automatically from the source itself. I tried using AJAX but ran into some CORS limitations. In short, the csv file’s source domain did not allow other origins (i.e. other websites) to query its data. I ended up just using a server-side solution (which is probably not legal). From then on, I was inspired to research 1 several ways one can implement CORS in different scenarios. Below is a sample of ways one can implement this technique.

Image Ping:

This works for GET requests. You won’t be able to read the response text. However, it’s a good way to make sure the target server receives a notification from the origin page. From what I’ve learnt, it is one of the ways ads can track their views. You can also track user-clicks (or general interaction) without unnecessary interruption.

var imgCORS = new Image();
imgCORS.onload = imgCORS.onerror = function(){ 
 console.log("Done & Dusted");
};
//we assign both onload and error the same fn to ensure we capture both responses from the server
imgCORS.src = "http://www.example.com/track?clickedLink=3";

Script tags with JSON Padding (JSONP):

JSONP uses script tags to communicate with other domains. It has a leg up on the image ping thanks to its ability to read the response. To summarize JSONP: you create a script tag, you assign the source in a specially formatted way, and finally you use a callback to read the response. Things to keep in mind:

  • The target source has to be ready to process your request
  • The requester is at the mercy of the target source because the callback will be executing whatever the target source sends back.
  • There is no defined way to handle errors due to the lack of error handling for script tags in browsers (setTimeout can be used, but that assumes the same connection speed for every user; see the sketch after the snippet below)

function handleResponse(response){
  console.log("Your name is  " + response.name + ", and you're " + response.age + " years old.");
}
var script = document.createElement("script");
script.src = "http://example.com/json/?callback=handleResponse";
document.body.insertBefore(script, document.body.firstChild);
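
To illustrate the last bullet above, here is a rough sketch of the setTimeout workaround; the five-second window is an arbitrary guess and, as mentioned, assumes every user has a similar connection speed.

var responded = false;

function handleResponse(response){
  responded = true;
  console.log("Your name is " + response.name + ", and you're " + response.age + " years old.");
}

var script = document.createElement("script");
script.src = "http://example.com/json/?callback=handleResponse";
document.body.insertBefore(script, document.body.firstChild);

setTimeout(function(){
  if(!responded){
    console.log("No response after 5 seconds; assuming the JSONP request failed.");
  }
}, 5000);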

Comet/Long Polling:

I have to admit that I’ve never used comet, long polling, or any other form of server push. My understanding is purely theoretical, so feel free to correct me if need be. To explain it, comet/long polling has the server push data to the browser instead of the browser requesting data with AJAX. This way you get a real-time response from the server (think sports scores/twitter updates). Short polling involves the browser querying the server at regular intervals; long polling reverses the process and has the server hold the gates open (so to speak) until it has something to send. To summarize:

  1. Browser opens up a request
  2. Server holds the gates open until it has something to send back
  3. Browser receives the response from server and closes the request
  4. Browser immediately goes back to #1
function createStream(url, progress, finished){
  var xhr = new XMLHttpRequest(),
      received = 0;

  xhr.open("get", url, true);
  xhr.onreadystatechange = function(){
    var result;

    if (xhr.readyState == 3){
      //get only the new data and adjust the counter
      result = xhr.responseText.substring(received); //read from the last end-point
      received += result.length;

      //call the progress callback
      progress(result);
    } else if (xhr.readyState == 4){
      finished(xhr.responseText);
    }
  };
  xhr.send(null);
  return xhr;
}

var client = createStream("/streaming", function(data){
  console.log("Received: " + data);
}, function(data){
  alert("Done!");
});

Server-Sent Events (SSE):

SSE is an API for read-only Comet requests. It supports short polling, long polling & HTTP streaming. That’s as far as my knowledge of it goes. Read up more about the API here.
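
For completeness, here is a minimal sketch of the API, assuming a server endpoint at /updates that serves an event stream:

var source = new EventSource("/updates");

source.onmessage = function(event){
  //each "data:" line the server sends arrives here
  console.log("New update: " + event.data);
};

source.onerror = function(){
  //the browser reconnects automatically; call source.close() to stop it for good
  console.log("Connection dropped, retrying...");
};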

Web Sockets:

There’s far too much that has been written about Web Sockets. In short, Web Sockets are a better version of comet/long polling. You should keep in mind that Web Sockets don’t operate over standard HTTP (hence ws://example.com). So, how does it work exactly? I can verify that it is all magic!
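
Magic aside, the browser side of the API is refreshingly small. A minimal sketch, assuming a server listening at ws://example.com/socket:

var socket = new WebSocket("ws://example.com/socket");

socket.onopen = function(){
  //full-duplex: we can send at any time once the connection is open
  socket.send("Hello from the origin page");
};

socket.onmessage = function(event){
  console.log("Server says: " + event.data);
};

socket.onclose = function(event){
  console.log("Connection closed (clean: " + event.wasClean + ")");
};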

1. [The majority of this information (and code) was obtained from Nicholas Zakas’ Professional JavaScript for Web Developers. The rest I have tried to link to the original sources. Let me know if I missed anything!]

Links I encountered

I usually try to write a post at least twice a month. However, lately I’ve been swamped with new learning materials that completely absorb me (I like telling myself so!). I think it is fair to say I’m addicted to novelty. To cut to the chase, I’ve been learning lots of machine learning and data mining techniques of late. I’m a stats guy by education, so it’s only fair I’m drawn to these fields. To compensate, I’ve decided to post insightful links I encounter when I’m not writing blog posts. I’ll try not to abuse this idea and actually write insightful posts of my own when I have time.

The Passion Gospel – I’m far from experienced but this article (read: post) made me feel like a veteran. Are you a noobie programmer? Are you desperately trying to get into the industry? Do you want to know what it’s like? Please read this. Just make sure you don’t come out of it a cynic!
PS: Go read up on several of his posts. To say he’s a brilliant writer would be an understatement.

GitHub Isn’t Your Resume – GitHub has slowly massaged itself into the modern developer’s toolkit. No LinkedIn recruiter will listen to you without a GitHub account. You have a website? Better have a link to your GitHub on it (guilty!)! I love open source. As a matter of fact, I owe it most, if not all, of my programming knowledge. However, don’t make it mandatory. After all, you’re removing its fuel (i.e. people deriving pleasure from helping others without any strings attached).

I know the two articles are somewhat negative but I’d like to think of them as reality checks. For positive news, click here

JavaScript and Floating Points Arithmetics

We (i.e. anyone who has played with JavaScript) have all heard something about how floating points are tricky and borderline impossible to deal with. While this is not exclusive to JS, it is worth knowing a thing or two about the limitations of dealing with floating-point numbers.

Let’s start with a well known example:

var a = 0.1 + 0.2;
a === 0.3;   // false
console.log(a);   //0.30000000000000004

The common workarounds are to use the toFixed() method on numbers, or to convert everything into integers, perform the calculations, then convert everything back into decimals. Neither method is guaranteed to produce the correct result, especially when dealing with complex calculations involving several floating-point variables.
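
For instance, both workarounds applied to the example above look something like this:

var a = 0.1 + 0.2;

//toFixed() rounds and returns a string
a.toFixed(2);               // "0.30"
parseFloat(a.toFixed(2));   // 0.3

//working in integers (tenths here), then converting back
(1 + 2) / 10;               // 0.3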

I found the best way to understand floating-point problems is to use the decimal system most humans are so used to. Try expressing 1/3 in decimal as best you can. There is no way to express it with full precision. There are conventions, like writing 0.333... repeating, but these only confirm our inability to express 1/3 exactly in decimal. Something similar is happening with JavaScript and floating points.

Anyone who has taken an intro class in Calculus will be familiar with Zeno’s paradox. To summarize it, 1 + 1/2 + 1/4 + 1/8 + ... will always approach 2 but never be equal to 2. This is because we are always halving our distance from 2. That is exactly what is going on when JavaScript tries to express some floating points.

Consider this Binary code:

Integers:
Binary: 1 => Decimal: 1
Binary: 10 => Decimal: 2
Binary: 1101 => Decimal: 13

Floating points:
Binary: 0.1 => Decimal: 0.5
Binary: 0.0101 => Decimal: 0.3125
Binary: 0.00011001 => Decimal: 0.09765625
Binary: 0.00011001100110011 => Decimal: 0.09999847412109375

As you can see from above, the binary value is getting closer and closer to 0.1 (in decimal) but never actually equals it. It is a shortcoming of expressing certain floating points in binary, in the same way we can never fully express certain fractions (e.g. 1/3) in decimal. You can try this with pretty much any base system (try expressing 0.1 (decimal) in base 3).
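
You don’t have to take my word for it; JavaScript will happily show you the binary expansion it is actually working with:

(0.1).toString(2);  //"0.000110011001100110011..." - the pattern repeats until the bits run out
(0.5).toString(2);  //"0.1" - exact, because 0.5 is a power of two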

To come back to our original issue (i.e. 0.1 + 0.2), the calculation is carried out in binary and the result converted back into decimal. JavaScript numbers are 64-bit IEEE 754 doubles, so each value can only hold a finite number of binary digits and the expansions of 0.1 and 0.2 get cut off. Truncating them to 32 bits for illustration, the sum looks roughly like this:

0.00011001100110011001100110011001 //approx. of 0.1
+ 
0.00110011001100110011001100110011 //approx. of 0.2
__________________________________

0.01001100110011001100110011001100 //the actual result in binary to be converted into decimal

Want to try something even more fun?

for(var i = 0, x= 0; i<10;i++){
  x += 0.1;  //increment x by 0.1 ten times
}

console.log(x); //0.9999999999999999

PS: I should emphasize that this isn’t something that is unique to JavaScript. Most languages by default have this issue. I just used JavaScript because it’s the most comfortable/easy language to express the idea.

One Year Later

Twenty-nine posts in, and it has been one year since I started running this blog. To be honest, I thought I would have given up on it after several weeks. However, the page views have been somewhat surprising and encouraging. The responses have been enough to push me to publicize some of my projects. Below, I will try to summarize some of the projects I have been involved with in the past year.

UK Football League Stats: This is my latest project (as of Jan 8th, 2014), and a surprisingly expeditious one. Coming from a stats background, I have always been keen on learning d3js, and this was the perfect opportunity to infuse the library with football and statistics. A testament to my vibrant social life, I churned out the first working draft on New Year’s eve. It scrapes up-to-date csv data from Football Data’s website using php, processes and displays the data using d3, and finally adds interactivity to the tables using the datasorter and DataTable jQuery plugins. Despite its short turnaround time, I have learnt the most from it. I started out hosting the csv data on my server (manually uploading the files), only to realize I would need a better solution to make sure the data is always up-to-date. This prompted me to learn some basic web scraping and its limitations (mostly CORS related). The project alone has inspired me to learn more about web data scraping.

SoundBum: One of my longest running projects; it uses Soundcloud’s API to play random music from any genre. I got the idea from working on CodeAcademy’s Soundcloud API tutorial. Previously, it had a pretty basic functionality. It prompted you for a genre, and returned a random song. In the last month or so, I decided to make it slightly more interesting. It can now auto-load a new song whenever the current one ends. This was a tricky hack which led me to learn about iframe’s sandbox attribute and foreign API limitations. It is also the only app I actually use on a daily basis. Future versions might include a better UI (currently it only has an input, button and an iframe) but I am not sure of the direction yet.

GIF to Canvas: In my desperate attempt to play the JS/HTML5 game like all these hipsters, I created a simple animation with playback controls. If possible, I might include some web scraping to create these animations by automating the entire process.

Guess the Number: A simple game that employs the binary search algorithm. It uses basic jQuery and CSS. It is a UI improvement on a previous implementation.

There are many more smaller JavaScript challenges I have worked on, but they are more like 30-minute challenges I use to brush up on my knowledge (and learn new interesting tidbits). These tend to inspire me to write tech related posts.

In the next year, I am hoping to become more collaborative in my coding. The other day, I had my first pull request accepted. It felt strange yet exciting directly contributing to an open-source project, despite it being a 3-line code update. Hopefully many more to come. Here’s to another year!

The XY Problem

The beauty of technology problems is that they are applicable in so many different fields, including daily tasks. Back in high school, I used to deliberately leave some parts of assignments half-assed just so I would be asked to re-do them. I would always make sure these were sections I was most confident in my ability to fix. It would always lead to instructors ignoring the parts whose quality I was less sure about. Later on, I had my moment of clarity when I learnt about Parkinson’s law of triviality. It is a phenomenon that extends as far as management, one among many. I cannot count the number of times I have abused this technique.

Of the problems that seem to persist across fields, I have recently been guilty of one that tends to lurk under the radar. Let’s say an arcade owner has a problem with counting coins. He decides to employ ten people at $8/hour. He then struggles with the logistics of organizing the 10 people to finish the task in a timely manner. He goes out and asks a friend about the best way to organize 10 people in a production line. He ends up in an even more complicated situation after his friend mentions that two five-person groups seem to work better than one ten-person group. He now has yet another problem: deciding how to divide the group of ten into the best five-person teams.

Ignoring the terribly thought-out hypothetical scenario, the arcade owner is at fault for not realizing what problem he was solving in the first place. He needed to find the optimal way to count coins at the end of the work day. In the way he went about seeking help, he never mentioned his primary problem. He should have mentioned the coin-counting problem that led to his decision to hire ten people in the first place. His friend might even have mentioned the possibility of leasing a coin-counting machine, a much cheaper alternative used by other arcade owners.

Like any developer, I tend to scour online help forums. I cannot count the number of times the first response to a question is “What exactly are you trying to do?”. An old Usenet post describes this as the XY problem. In short, one wants to accomplish task X. He is not sure how to solve X, so he comes up with a solution Y. He is not sure of the best way to implement Y either, so he asks for the solution to Y, assuming that by solving Y he will end up solving X. Those trying to help fail to understand why one would want to solve Y, usually because Y is a strange problem to solve. In the end, no one is usually happy.

I think it is a safe guess that a good number of these questions (typically of the “how do I get the last three characters of a filename?” variety) were really trying to obtain the file extension. Instead of directly stating their main problem, the askers came up with a solution which assumes that all file extensions are three characters long (HINT: not true). The issue is pervasive enough to deserve its own wiki with numerous examples.
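
In code, the gap between the Y that gets asked and the X that was actually needed looks something like this (with a hypothetical filename, of course):

var fileName = "holiday.jpeg";

//the "Y" question: how do I grab the last three characters?
fileName.slice(-3);                              // "peg" - not an extension at all

//the "X" problem: what is the file extension?
fileName.slice(fileName.lastIndexOf(".") + 1);   // "jpeg"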

In all this seemingly noob-bashing (and, by extension, self-bashing), I feel some of it is accidental. In the arcade owner example, he probably does not know that other arcade owners face the same problem. Maybe he is the only arcade owner in the area. Maybe he is a new arcade owner without a clue about best practices. While it is easy to blame the asker for their lack of knowledge, ignoring their position is equally unfair. It is too easy to forget the number of times we all assume our problems are unique. The main lesson should not be how to ask questions but rather that most problems are not unique. Sometimes that lightbulb might just be a firefly.

Why eval() is evil

If there were one function truly loathed by JavaScript developers, eval() would be an easy winner. And for good reason. A lot has been written about eval() and people have gone into great detail about why the function should be avoided at all times (and a few contrarians have tried to defend it…mugs). However, I will not try to rewrite what a five-second Google search can explain. I will use a simple example because those seem to drive the point home.

Let’s assume you have a little secret you want to keep away from a user (although if you use a front-end environment to keep secrets, you’re not really good at this secret business). Your secret happens to be 'It's a little secret'. You store this secret in a creatively named variable, myLittleSecret. You forget about this secret and write another function to evaluate what the user inputs.


  var myLittleSecret = "It's a little secret";

  function givethBackToUser(){
    var userInput = prompt('Give us something and we will return it; we promise!');
    //the user's input is pasted straight into the string that eval() runs as code
    eval('alert("We returned: " + ' + userInput + ')');
  }

In the above example, whatever the user types is dropped straight into the code that eval() executes. Type a quoted string and the alert dutifully returns it. Type myLittleSecret, however, and instead of being treated as a string it is evaluated as a variable, so the user will see We returned: It's a little secret. In the end it won’t be a little secret after all.

There is a lot more harm that can happen when using eval(). The main point is that everything you can do with it can be done differently and just as efficiently. There is just too much that can go wrong when you are at the mercy of your users.
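
In this case, the harmless alternative is embarrassingly simple: skip eval() entirely and let the input stay a string.

function givethBackToUserSafely(){
  var userInput = prompt('Give us something and we will return it; we promise!');
  //plain concatenation: the input is only ever treated as text, never executed
  alert("We returned: " + userInput);
}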

Javascript Recursion, Variable Scope and Hoisting

One of my most popular posts involved JavaScript function declarations vs function expressions. In that post, I mentioned hoisting in passing but never went into detail about how it actually works. In this post, I’ll try to address that using an example I encountered recently.

While going through an online JavaScript quiz, I couldn’t figure out why my code wasn’t passing. The task was to return the sum of all integers in a given array. The array could contain any data, including other arrays. By all accounts, my solution should have been returning the correct result, but it still wasn’t. I had to do some line-by-line debugging, and considering it was a recursive function, things got tricky. In the end, I was certain the recursion was the culprit.

My initial solution looked like this:

function arraySum(i) {
 sum = 0;
 for(var x = 0; x<i.length;x++){
  if(typeof i[x] == 'number'){
   sum += i[x];
  }
  else if(i[x] instanceof Array){
   arraySum(i[x]);
  }
 }
 return sum;
}

This one obviously failed and would return the sum starting from the innermost array. If you tried to run arraySum([1, 2, 3, [4, 5]]), it would return 9 instead of 15. The culprit is that sum is never declared with var, so it becomes an implicit global that every recursive call resets to 0: whatever was accumulated before the nested call gets wiped out, and the value returned by the nested call itself is discarded.

The next solution was the trickiest one to figure out:

function arraySum(i) {
 var sum = 0;
 for(var x = 0; x<i.length;x++){
   if(typeof i[x] == 'number'){
   sum += i[x];
  }
  else if(i[x] instanceof Array){
   arraySum(i[x]);
  }
 }
 return sum;
}

This function would only return the sum of the outermost numbers and never include the sums of the inner arrays. If it isn’t obvious yet (it wasn’t obvious to me for about 30 minutes), variable scope was at play. In the original function, because of the order of the elements, it would return the correct result if, and only if, the array passed had the innermost array as its first element. If you had 3 or more nested arrays, then the deepest array would always have to be the first element of its parent array. If you passed arraySum([[4, 5], 1, 2, 3]) to the initial function, it would return 15, which happens to be the expected result.

In the second solution, only the sum of the outermost array would be returned. Both arraySum([1, 2, 3, [4, 5]]) and arraySum([[4, 5], 1, 2, 3]) would return 6. This ruled out the order of the elements and highlighted the variable scope complications present.

Variable scope refers to the extent of the availability of a variable in code. In JavaScript, variable scope is most prominently tied to functions. If you declare a variable using var x inside some function y(), x will only be available inside y().
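
A quick sketch of the idea:

function y(){
  var x = 10;            //x only exists inside y()
  console.log(x);        //10
}
y();
console.log(typeof x);   //"undefined" - x is not visible out here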

Applying this logic to the second function, it becomes obvious why only the sum of the outermost numbers was returned. Look at the second line of the second function (i.e. var sum = 0). Every recursive call had its own sum variable in its own scope, so the totals computed by the inner calls were never added to the outermost call’s sum and were simply discarded.

Taking this into consideration, the final solution looked like this:


function arraySum(i) {
 var sum = 0;
 function sumArr(i){
  for(var x = 0; x<i.length;x++){
   if(typeof i[x] == 'number'){
    sum += i[x];
   }
   else if(i[x] instanceof Array){
    sumArr(i[x]);
   }
  }
 }
 sumArr(i);
 return sum;
}

To avoid the variable scope issues, I opted to employ a closure. It ensured that neither the order of the elements nor the scope was going to change the final returned value. There are other impressive hacks that avoid recursion altogether, but I opted for a closure to illustrate the problems one might run into when dealing with recursive functions.
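
For what it’s worth, another fix keeps the recursion but captures each inner call’s return value instead of sharing one variable across scopes; a quick sketch:

function arraySumAlt(i) {
  var sum = 0;
  for(var x = 0; x < i.length; x++){
    if(typeof i[x] == 'number'){
      sum += i[x];
    }
    else if(i[x] instanceof Array){
      sum += arraySumAlt(i[x]);  //the inner total is added instead of discarded
    }
  }
  return sum;
}

arraySumAlt([1, 2, 3, [4, 5]]); //15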

On a funny somewhat unrelated note, try this in your browser console:


var definitelyNotANumber;
definitelyNotANumber += 2; //NaN
typeof definitelyNotANumber; //wat?

Brave New Joker

I got this off a reddit post. It’s written as the birth of the Joker set in the novel Brave New World. One of my favourite pieces of writing.

“They kept telling me ‘everything was going to be alright'” she said as she was handed another soma. A look of expressionless tranquility ran across her face, it started in her eyes and she broke the silence once more. “I don’t know what I was so worried about,” she said. “everything is going to be alright.”

I was born an Epsilon. I trust in Ford that I am where I am because of who I am. I’ve been told I am not the brightest, or else I would be a Delta…I was told if I didn’t worry s- pause takes out a orange RX bottle …They told me that it was fine. They told me one of these will keep me perfectly efficient; like the model T. In Ford I trust.

My name is James. I am an Epsilon. I work hard so that Gammas can work harder. Everything going to be alright if I just follow Ford. Last time I went to refill my soma the Betas called out my name and asked for me to come over to them. I complied, and walked over briskly

“Epsilon 80010, you have exceeded expectations,” Said one of the Betas. “You would do well to be rehabilitated for Gamma duty”
I was worried. I checked for my soma, but it was empty. That’s when it started.
“Epsilon 80010, if you are willing to outfit yourself to the Gamma quarters we can begin transition as soon as possible”

I became anxious. I wasn’t happy. My life was changing. The model T did not change. Why should I? What’s wrong with me? Why can’t I stay here? I was lost in my thoughts. The Beta noticed me shaking and grabbed my arm. Then it happened. I couldn’t stop him.

An alpha, the greatest grandson of Ford himself stepped in. He gave me a soma- at least I think it was. It was blue and white.
“Here” He said turning to the Beta “I see you are preparing this Epsilon for his transition to the Gammas. This is highly unusual.”
“I agree sir.” The beta muttered

I put the pill in my mouth
“Why haven’t I been informed?” questioned Alpha ford.
And swallowed. My mouth was dry. I could feel it go all the way down into my stomach.
“Sir, you were not deemed necessary for the maximum efficiency of this move” explained the Beta, “He is one epsilon. You should attend to your duties as the alpha you were created to be”
“You’re right Beta.” Sneered Alpha Ford, and he walked away. He glanced at me as he was walking, smiling. I was worried.

everything is going to be alright

I followed the Beta to the Gamma quarters. Not a single Gamma spoke to me. The beta explained it was because my former Epsilon self would inhibit seamless social activities. He recommended I go to sleep and wait for the morning. I didn’t understand, but obliged him.

The next morning I woke up. I could feel the colors on the wall. I could taste the sunrise. It felt different. In a panic I went for my soma, only to find an empty bottle. This wasn’t good. Everything was going wrong.

I raced to the commons to solicit any soma from any gamma. Shunned, and unfamiliar, dozens of Gammas began to open a hole for me until I was surrounded by 50 of my new peers.

“What is it Epsi?” taunted a faceless voice.
My heart began to race.
“Are you worried?”
my hands were growing clammy
“Do you belong here?”
I began to close my eyes and think of Ford. I began to think of his grand son. I began to think of that pill.
“SOMA” I exclaimed.
A young gamma came up to me. She was much smaller than me and asked me to bend down. She began to whisper “everything is going to be alright” and handed me her last soma. I quickly took it, but Before I could thank her I realized she was gone.
I waited for everything to calm down. I waited for everything to be okay. I knew that everything was going to be okay. The crowd around me started to disperse and go on with their day. It was a perfect day.
“You don’t belong here”
I was standing alone
“It wont work anymore”
I began to worry
“You are not a Gamma”
I covered my ears to try to block the voices. They kept coming back “James will be an Espi forever” mocking me. I closed my eyes and thought of Ford, I thought of the model T. To no avail. They began to multiply.

“You are not the same” “You have been perfected” “Smile” “You are not a gamma” “Smile”
They were mocking me: A low level Gamma. I collapsed on the ground and closed my eyes. Looking up at the ceiling I could see the girls face, the one who had given me the soma as she bent over me. I looked up at her and she said “Smile, everything will be alright”

The next few weeks are a blur to me. I was put into a room by myself for Epsilons who had graduated to Gamma level, or the “Divergent Chamber” I was kept on watch with a bottle of soma, a mirror, a bed and a television. Every morning I would wake up to Master Ford. Every night I would fall asleep dreaming of the model T. But everytime I would close my eyes a chorus of nagging voices began their awful symphony.
“Let’s put a smile on”

I took as much soma as I could to stop the voices. After the third bottle I realized that I couldn’t feel it anymore. That it wasn’t strong enough for me. I told the Beta who was watching me. She told me that this was to be expected.

“Everything will be okay/ Let’s put a smile on”

I began to stop feeling most things. I only began to feel terror. I was anticipating the voices at all moments of the day. The only tether I had to reality was the mirror.

One day. While the beta was changing my bed sheets the voices came back with a vengeance. “Smile, darling” “Everything will be alright” the beta had his mechanical instruments in the corner. “Take it james.” I looked closer at it and could see the glint of a small blade. “Smile” I stood up. The beta became confused. I started to walk over to the corner. He grabbed my arm.

I froze. I didn’t feel the terror anymore. I felt the beta. HE felt my terror. I smiled.

“Everything is going to be okay, let’s put a smile on that face”

I checked my smile in the mirror and left the room. Leaving a vacancy in both the Beta and the Gamma levels