To go with last week's "college" trend, I'm really starting to think that college just isn't worth it. My roommate just told me that he ran into our former RA, who graduated last year. He's now employed full time as a cashier at a local Target. I know quite a few friends who graduated college and have either moved back home to mooch off their parents or taken a menial, low-paying job somewhere, living in a crappy apartment with no insurance and struggling to pay the bills. Many of these kids were aiming high while they were in school getting their degrees, with huge aspirations of making it big and beaming smiles on their faces as they looked forward to a bright future, but after graduation, "reality" hit. The only exception I've seen so far is my cousin, who graduated a couple years ago from Yale Law School (after doing her undergrad at Harvard) and got picked up by a high-paying law firm in D.C.

Quite honestly, I do enjoy academia and learning, but I don't see how any of this will benefit me beyond using the degree as resume filler to more easily land a job. I'm getting my English degree because I'm idealistic and enjoy the subject; I like reading others' works and learning the nuances of literature, writing, and language. The degree might help me get a job as a paramedic in the future, but honestly, so would solid work experience as an EMT and good references. I feel like I'm just dragging on in college, waiting to get out so I can start "real life".