serpentsrose answered: No, never has been, never will be. Education is a gift that keeps on giving. One that should be regifted to the next generation.
morgaz11 answered: no! college offers the opportunity for people to be educated about the realities of the world. knowledge is power. the people need more power.
rjplab answered: Yes. Just go out and do things. It’s not worth wasting your time so you can spend your time repaying dues for that time. Ouroboros, and all.
mikiballard answered: Yes. Companies only care that you have a degree, not what it’s in. You learn most of your trade once you get your foot in the door.
doctorbornwinning answered: I see college as an investment in myself, expanding my knowledge and worldview. To some it may be a waste but not to me.
buxomly answered: If your long-term goal is to make money, just remember that not every college degree can a) get you a job and b) make you money.
musicforthemusicallychallenged answered: Almost all jobs created since the recession have been for college grads so…no.
xxomrshenao answered: Personally, it depends on the person and what they want for themselves. Honestly, school is not for everyone, unfortunately.
lafaux answered: It is one of the best investments that we can make. College grads get better jobs, make more money, pay more taxes and buy more stuff.
laptopmnky answered: It depends. Hopefully the tougher economic times are forcing people to make more informed decisions on what is right for them.
garrott answered: As of now, no, or at least not yet; a college degree is almost always necessary for professional achievement.
jhenao89 answered: As a current college student, I would say it depends on the field you are studying. It’s worth investing that money in a trade, like becoming an electrician.