But it just downloads the source file. I want to download the whole text that I get when I open Inspect Element and copy all the text from the HTML tag. Sorry for my bad English, please help.
I am not sure what the problem is. When you request a website, the server will send you the source, consisting of HTML, CSS, and JS, like this excerpt from the URL above.
You need to search for what you are looking for, either with string::find or regular expressions.
Thanks for the reply. Here is the problem: from this link I want to take the Experience Points value, 151,713, but when I search for it, it is not found in the source code! It is found in the Inspect Element box, though. https://www.americasarmy.com/soldier/Pvt.Phushi
Ok, so I read up on it, and it seems JavaScript adds something.
From Stack Overflow: "When you say 'view source', I'm assuming you're talking about the editor, not the actual 'View Source'. When you 'view source' from the browser, you get the HTML as it was delivered by the server, not after JavaScript does its thing."
So how do I download the code after JavaScript does its thing, not the HTML that was delivered by the server before JavaScript ran?
Ah, ok, thanks for that, Mr. Thomas. I will look into what you mentioned. Do you recommend I learn libcurl, then? Or should I not? Is it useful to know (not for this project, but in general)?
I think you first need to decide what you want to do and what to use. The above-mentioned frameworks all provide their own classes for internet access, and if you use one of them for the GUI, there won't be much use for libcurl.
However, be aware that all these frameworks take quite some time to learn. If you just need a little app for this website, you might be better off asking in the jobs section. It's a rather easy thing to do and should not cost more than a few dollars.
Ubuntu 14.04 and vim to write C++; I downloaded Qt to check it out. Yeah, I know it's going to take a long time to learn, and I don't get much free time because I'm in my senior year, but I'm trying to learn :D