C++ / C interesting questions - best practice vs. innovation vs. brainstorming

1) Which data structure would you use to store 1 000 000 integers?
2) Which data structure would you use to store 10 000 000 telephone numbers?
3) What search algorithm would you implement for the fastest search over 10 000 000 telephone numbers?
4) If you had a C++ program that needs to process and load a lot of information into memory, and you ran out of memory, how would you modify the program?
5) What are best practices to avoid running out of memory when processing large data sets?
6) How would you sort 1 000 000 000 integers as quickly and efficiently as possible?

So what are your answers?

You know, so you're not just a copy/paste machine between us and your tutor.
Bad questions are bad.
1) What are you going to do with them? If you just need to store and walk them, a vector is fine, but if you want to delete and insert in the middle of it, that is going to have issues.
2) Same thing: what are you going to do with them? Needing to look one up is different from iterating all of them for "are they valid".
3) Trick question. O(1) is always the best way to search things. If you need speed, make that happen with a hash table (there is a small sketch after point 4 below).
4) Buy a bigger computer. Define memory... a file system is memory, it's just slow. If you mean running out of RAM: old textbooks had a picture of the tiers of memory. The top tier costs the most per byte, so you have the least of it, and it is the fastest. As you move down through the tiers, it gets slower, cheaper, and bigger. The tiers (or a few examples) would be CPU registers, CPU cache, second-level CPU cache, RAM and graphics-card RAM, local disk, remote disk on a file server, etc. The answer is that when you run out of one type, you use the next best type. The compiler and CPU do this for you at the register and cache level (but you can code somewhat to make that work better by observing how it works). If you are going to smoke-and-mirror the data out to the file system, you have to code for that yourself (well, virtual memory may do it for you, but it's horrid on most operating systems).
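
Something like this is what I have in mind for 1-3, assuming the integers just need to be stored and walked and the phone numbers need fast exact-match lookup (the sizes and the example number are made up; if inserts/erases in the middle dominate, the container choice changes):

#include <cstdint>
#include <iostream>
#include <unordered_set>
#include <vector>

int main()
{
    // 1) A contiguous vector is the default container for a million integers.
    std::vector<int> numbers(1'000'000, 0);
    numbers.front() = 42;                  // just to show the container in use; no real processing here

    // 2) + 3) A hash set of the phone numbers gives average O(1) exact-match lookup.
    std::unordered_set<std::uint64_t> phoneBook;
    phoneBook.reserve(10'000'000);         // reserve up front to avoid rehashing while loading
    phoneBook.insert(15551234567ULL);      // made-up example number

    const std::uint64_t query = 15551234567ULL;
    std::cout << (phoneBook.count(query) ? "found\n" : "not found\n");
}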

5) You should have had this in your book or classroom. All I will say here is that it is "rude" to run a machine out of memory most of the time. Unless you have a dedicated machine for one very specific task (like beating the top guy at chess with your chess box), you want to avoid grabbing all of it so that other programs can still function. Most machines have many programs running at once...
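
For 4 and 5, the usual shape of the fix is to stream the data in bounded chunks instead of loading all of it at once. A rough sketch, assuming (purely for illustration) that the input is a flat binary file of 32-bit integers named "big_input.bin" and the "processing" is just a running sum:

#include <cstddef>
#include <cstdint>
#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    std::ifstream in("big_input.bin", std::ios::binary);
    if (!in) { std::cerr << "cannot open input\n"; return 1; }

    std::vector<std::int32_t> buffer(1'000'000);   // ~4 MB of RAM, no matter how big the file is
    long long sum = 0;

    while (in.read(reinterpret_cast<char*>(buffer.data()),
                   static_cast<std::streamsize>(buffer.size() * sizeof(buffer[0])))
           || in.gcount() > 0)
    {
        const std::size_t got = static_cast<std::size_t>(in.gcount()) / sizeof(buffer[0]);
        for (std::size_t i = 0; i < got; ++i)
            sum += buffer[i];                      // process one bounded chunk, then reuse the buffer
    }
    std::cout << "sum = " << sum << '\n';
}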

OK, this one is HARD if you don't know a bit of theory and algorithms. So I won't prove it, but I will help more here.
6) What kind of integers? Is bucket sort available (O(n))? If not, multi-thread a standard sort over chunks of the data and then stitch the sorted chunks back together (a rough sketch of that chunk-and-merge idea follows below). You could write a dumb merge (not merge sort) that peels the smallest value off the fronts of two sorted lists each time to put them back together, but std::sort uses a smart algorithm. It depends on which algorithm your standard sort is using, but hopefully it is one of the ones that approaches O(N) as the list approaches "ordered". If it is, re-sorting after the threaded pass is pretty good, since the list is already partially ordered; if it is not, the merge is the way to go. A billion sounds like a large number, but it isn't to the machine. std::sort can do 10 million sub-second on a half-decent machine (single-threaded here), and a decent machine has 6-8 CPUs, so you can do 50-80 million per chunk sub-second, then finish up and merge it all back in 2-3 seconds flat. And by decent machine, I mean a personal computer. Servers with terabytes of RAM and 20-100+ CPUs can of course do this before you can blink.
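
A rough sketch of that chunk-and-merge idea, scaled down so it stays self-contained (the random data and the 8 million element count are just there to have something to sort; a billion values is the same code with bigger chunks and enough RAM to hold them):

#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <random>
#include <thread>
#include <vector>

int main()
{
    std::vector<std::uint32_t> data(8'000'000);
    std::mt19937 rng(12345);
    for (auto& v : data) v = rng();

    // Split into one chunk per hardware thread and sort each chunk in parallel.
    const std::size_t nThreads = std::max(1u, std::thread::hardware_concurrency());
    const std::size_t chunk = (data.size() + nThreads - 1) / nThreads;

    std::vector<std::thread> workers;
    for (std::size_t begin = 0; begin < data.size(); begin += chunk)
    {
        const std::size_t end = std::min(begin + chunk, data.size());
        workers.emplace_back([&data, begin, end] {
            std::sort(data.begin() + begin, data.begin() + end);
        });
    }
    for (auto& t : workers) t.join();

    // Stitch the sorted chunks back together with pairwise in-place merges.
    for (std::size_t width = chunk; width < data.size(); width *= 2)
        for (std::size_t begin = 0; begin + width < data.size(); begin += 2 * width)
        {
            const std::size_t mid = begin + width;
            const std::size_t end = std::min(begin + 2 * width, data.size());
            std::inplace_merge(data.begin() + begin, data.begin() + mid, data.begin() + end);
        }

    std::cout << (std::is_sorted(data.begin(), data.end()) ? "sorted\n" : "bug\n");
}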


You're asking in the beginners' forum, so you'd like answers from beginners?
It's a first- or second-course school question. I probably overdid the answer, but you gotta learn sometime. The wording of the questions is very bad, which is an indication that the professor gave them the answers already (probably poor answers) and wants some GIGO back from the students.
1) Which data structure would you use to store 1 000 000 integers?
2) Which data structure would you use to store 10 000 000 telephone numbers?
3) What search algorithm would you implement for the fastest search over 10 000 000 telephone numbers?
4) If you had a C++ program that needs to process and load a lot of information into memory, and you ran out of memory, how would you modify the program?
5) What are best practices to avoid running out of memory when processing large data sets?
6) How would you sort 1 000 000 000 integers as quickly and efficiently as possible?

Quoting the original question, in case this is one of those that mysteriously disappears...