I just started teaching myself programming again. I don't remember why I stopped in the first place, but I have to start from scratch again, and that's not a problem. I just downloaded Dev C++ and I wanted to know if this is a good compiler to start with, or if there is a better one. Would someone please let me know? Thanks!
You should definitely try GCC. It's an alright compiler, but when you run your first programs, the console window will close shortly after the program finishes. If you get this and still want to continue using this compiler, then go ahead.
Dev C++ is not a compiler. Dev C++ is an IDE; as I understand it (never used it myself), it is an editor and organiser front-end using the MinGW compiler, which is a port of GCC (the GNU Compiler Collection), which is itself a set of compilers.
The GCC compiler is quite well-regarded. Dev C++ is no longer well-regarded, and many people (myself included) recommend against using it.
If you're just starting, I honestly think you'd do better with a plain text editor, running the compilation directly at the command line. This will ensure you learn what the compiler and linker actually do, and will force you to understand which files are being included and which libraries are being linked against. I see many people who have been learning C++ for quite some time who struggle to understand what's going on, and it does them no favours. Once you've got that grasp, by all means find an IDE that helps you code faster and better.
Others will have their own opinions, of course; I tend to code in XEmacs, write my own makefiles (and various helper scripts) and compile-link from the command line, hence my preference. Someone will be along in a minute to recommend whatever coding environment it is that they use :)
Dev-C++ is an IDE (not a compiler) which comes with a very old compiler. Dev-C++ itself is unmaintained; you shouldn't use it.
My suggestion is Code::Blocks + GCC/MinGW
See http://www.cplusplus.com/forum/articles/7263/
No shame in using windows; it's the world's most common desktop OS. It's just a bit easier to get right down to brass tacks under some other OSes, and I have a policy of trying to encourage everyone fresh to coding to handle a manual compile-link chain at least once.
That's just me, of course; others take a different view.
I hereby impose a three minute limit on myself, otherwise I'll be talking all day :p
You start with source code. These are plain text files. Under a C++ build chain, they undergo three principal procedures:
1) Pre-processing
#include directives are obeyed (which involves taking the entire file referenced, usually a header file, and copying it into place where the #include was), preprocessor macros are expanded, that sort of thing. You get out of this some changed plain text files. Some of the header files included will probably not have been written by you; they will be header files provided by a library, or by the OS, and it is up to you to ensure that the pre-processor knows where to find them. The pre-processor will have some kind of search path that you can change. If you use some header files in some crazy directory way over there in the distance, you must ensure that this information gets through to the pre-processor. How this is done varies from IDE to IDE and pre-processor to pre-processor.
2) Compilation
As a general rule, each plain text file is now compiled into an object file. The object file is a binary file. It is during this stage that the syntax is parsed, optimisation happens, all that good stuff. You get out, generally, an object file for each input text file. If you have used functions that you did not write yourself, but are relying on an already made library for, there will effectively be little blank slots in the object files where the references to these functions go, and the linker must fill in the blanks.
3) Linking
The object files are now linked together to form the output executable or library. If you have used functions that you did not write yourself, somewhere on your system will be an already made library binary file that you need to point the linker at so it can fill in the slots as mentioned above. This can be done in one of two ways; firstly, if you're "statically" linking using "static" libraries, the relevant bits are grabbed and put into your final executable and are there forever. Secondly, if you're "dynamically" linking with "dynamic" libraries, there is effectively a reference put in explaining which dynamic library that functionality can be found in, and when you run the executable you must ensure that the right dynamic library is available on the system (google "dll hell" for the potential downside to this). I'm sure you can think of advantages and disadvantages of both options.
If your linker is unable to find the functionality it is searching for in any of the libraries you have pointed it to, it will report an error along the lines of "undefined reference". Much like the compiler and pre-processor, it is up to you to explain to the linker which libraries it should be looking in, and where they can all be found. I often see people asking about "undefined reference" errors, and they try moving header files around and that sort of thing, essentially because they've just never had it explained what the different steps of the build chain are.
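You can provoke that error on purpose, which makes it much less scary the first time it happens for real (assumes g++; the function name is invented):

```shell
cat > caller.cpp <<'EOF'
int missing_function(int);    // declared, but defined nowhere at all
int main() { return missing_function(1); }
EOF

# Compiling alone succeeds: a blank slot is perfectly legal at this stage.
g++ -c caller.cpp -o caller.o

# Linking fails, because no object file or library supplies the function;
# g++ reports something along the lines of:
#   undefined reference to `missing_function(int)'
g++ caller.o -o broken || echo "link failed, as expected"
```

Note which step failed: the compiler was perfectly happy, and no amount of moving header files around would change what the linker can see.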
Anyway, I'll shut up now. If you ever do get your hands on a decent command line compiler, I really do recommend you build a very, very basic programme that requires you to pass the compiler the include directories and the libraries to link against, and experiment with it.
Code::Blocks, so I think that I will try that, or GCC.
Code::Blocks is an IDE; GCC is a compiler. I suggest you use them together. (An IDE is a toolkit that helps with editing and debugging, whereas GCC is what actually turns your source code into a program.)