Was just watching TV and a thought came to me.
So soldiers risk their lives on the battlefield, fighting for their country.
Then what happens after war?
Kind of like they lose their job, isn't it?
Is fighting a war for the country really just a job?
Are they really out of a job after the war?
What's left of their skills after that? How are those still relevant to society?
Martial arts trainers?
Do they then carry on with the new "job" with pride, or broken, with a wrecked-up mind?
All that mental damage after the war… are they still mentally capable of handling it?
What about us then? How do we look at our jobs?
Do we risk our lives and go all out for our jobs?
Does what we do require such dedication?
Who decides it anyway?
What if we took our job and truly likened ourselves to a soldier on the battlefield?
What would be left of us?
What would be the outcome of the tasks we uphold?
What would it be like?
What would it take for one to give everything that one has?
Thanks for coming by.