Thursday, August 11, 2011

Does life get better after high school?

I really, really hate school. I hate going to school, I hate being in school, I hate being around everyone at school. It's not even the workload, really. I just feel like everything is plastic. The concepts they teach are rigid, the teachers are arrogant, and the people there tend to have so much enthusiasm that they come across as shallow. I'm almost out of high school (my last exam is tomorrow), and I can't wait until everything's finished and I'm done for good.

The only thing is, I'm pretty much an adult now - I'm 18, and this fall I have to start looking for a job. I'm actually a bit afraid of what comes next. The world seems like such a cold, uncaring place. Employers generally strike me as very harsh, judgmental, and condescending. If you're even slightly irresponsible, you could wind up without a job so fast that your head will spin. One false move can land you in jail, and society has no sympathy for people who don't fully abide by the law. Adults are expected to figure things out by themselves. I'm terrified of the prospect that I'm going to be tossed into such a brutal "sink-or-swim" world. A world where I could easily wind up homeless, isolated, overwhelmed, and depressed.

I have no idea what career would best suit me. So as excited as I am to leave high school, I'm completely lost. What's there to look forward to?
