Reality sucks! A little over a year ago, I was jobless and searching for a dream job. I looked at every listing on the job sites and in the newspapers. 99% of the ones I wanted to apply for required at least a degree in the relevant field. Why is that piece of paper so important? Shouldn't experience and work attitude matter more?