Is it imperative that you have a degree to work in America? I'm not talking about working in a shop or something like that, but as an IT professional or something similar.
I don't think you need a degree, but it can help you a lot.
That said, if you were to work in America you'd have years of experience under your belt, which will probably help you even more. A degree is, in a sense, just a way of proving your skills and validating that you understand things, and a higher educational qualification can help. But as I said, the experience you already have will be very useful should you work in the US.
To do white-collar work you should have a degree.
Imperative!
ClouD uses Impervious, deflecting the attack.
But to make actual money, it's practically a requirement to start your own business.
Unless you're an accountant, doctor, or lawyer.
In England this might be the case, but I don't think it is in America.
It is for white-collar work. We just call it office work lol.
I will be setting up my own company in the next year or two, hiring myself out to businesses as a contractor. It's easy to do here, but I would love to work and live in America for a couple of years, and I worry that without a degree it would be pointless.
I will have my diploma in Business Analysis by then, but that's really just a certification in my line of work, not a degree. I also don't see much demand for contractors in America, which worries me more. Maybe I should look at working for a company with offices in America.
Thanks for the info guys :)