problem with urlmon

I am trying to download a wiki dump from a site,
but I noticed that after a new dump becomes available, my program is still "downloading" the old dump.
What I downloaded manually and what my program downloaded are different...

What I suspect is that it keeps retrieving the old temporary copy the program downloaded earlier.

Is there a way to force it to re-download the file?

Here is my code:
#include <tchar.h>
#include <windows.h>
#include <urlmon.h>
#include <cstdlib>   // system()
#pragma comment(lib, "urlmon.lib")

int main()
{
	// Download the gzipped dump to disk
	HRESULT hr = URLDownloadToFile(
		NULL, _T("http://dumps.wikia.net/z/zh/zhpad/pages_current.xml.gz"),
		_T("C:/Users/James/Desktop/PAD/pages_current.xml.gz"), 0, NULL);

	// Remove the previously extracted file, then decompress the new archive
	system("DEL /F \"C:\\Users\\James\\Desktop\\PAD\\pages_current.xml\"");
	system("gzip -d \"C:\\Users\\James\\Desktop\\PAD\\pages_current.xml.gz\"");
	return 0;
}
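
While searching around, I came across DeleteUrlCacheEntry from wininet.h. From what I read, URLDownloadToFile goes through the WinINet (IE) cache, so deleting the cache entry for the URL before downloading is supposed to force a fresh copy from the server. Would something like this work? Just an untested sketch on my part, using the same URL and path as above:

#include <tchar.h>
#include <windows.h>
#include <wininet.h>   // DeleteUrlCacheEntry
#include <urlmon.h>    // URLDownloadToFile
#pragma comment(lib, "wininet.lib")
#pragma comment(lib, "urlmon.lib")

int main()
{
	LPCTSTR url  = _T("http://dumps.wikia.net/z/zh/zhpad/pages_current.xml.gz");
	LPCTSTR file = _T("C:/Users/James/Desktop/PAD/pages_current.xml.gz");

	// Drop any cached copy of this URL first; harmless if nothing is cached
	DeleteUrlCacheEntry(url);

	// With the cache entry gone, this has to fetch the file from the server
	HRESULT hr = URLDownloadToFile(NULL, url, file, 0, NULL);
	return SUCCEEDED(hr) ? 0 : 1;
}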
Why do you need your own program for that? You'd be better off using curl or wget ... in a script/batch file.
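
If you do want to keep it inside your own program, you could still shell out to curl the same way you already shell out to DEL and gzip; curl talks to the server directly instead of going through the WinINet cache. A rough sketch, assuming curl.exe is installed and on the PATH:

#include <cstdlib>

int main()
{
	// -L follows redirects, -o writes the response body to the given file
	return system("curl -L -o \"C:\\Users\\James\\Desktop\\PAD\\pages_current.xml.gz\" "
	              "\"http://dumps.wikia.net/z/zh/zhpad/pages_current.xml.gz\"");
}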
Well... I am just a noob =[

I am trying to download the file, extract it, and process it for my own purposes
(maybe just for some self-satisfaction LOL).
In the code above I skipped the unrelated parts and only showed the downloading and extracting lines.

I didn't know how to download a file using C++; urlmon just happened to be the first thing I found.
It worked originally, but after the file was updated on the server, this problem made me curious.