I am developing an application that finds the shortest path between 2 nodes on a graph. The problem is that graph has about 100 nodes, and each node has on average 4 edges connecting it to other nodes.
Would this be a case where conventional shortest-path algorithms would be too computationally expensive? If so, what other approach could I take?
Indeed, a network with 100 nodes has a high chance of being NP-hard. A better approach is to use heuristics. You could try applying a scatter search, which yields solutions of reasonably high quality, but you won't be guaranteed the optimal solution.
a network with 100 nodes has a high chance of being NP-hard
As far as I know, it's not the size of a data set that makes a problem NP-hard or NP-complete. But anyway, I just finished implementing Dijkstra's shortest path algorithm on 200+ nodes and it runs fine (time-wise, at least). Loading the data may take some time, but the run time of the algorithm is faster than I expected.
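For reference, here is a minimal sketch of Dijkstra's algorithm in Python using a binary heap, roughly what the implementation above would look like (the adjacency-dict representation and function names are illustrative, not from the original post). At 100 nodes with ~4 edges each, this runs in well under a millisecond:

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path via Dijkstra's algorithm.

    graph: dict mapping node -> list of (neighbor, weight) pairs.
    Returns (path, distance), or (None, inf) if target is unreachable.
    """
    dist = {source: 0}
    prev = {}
    heap = [(0, source)]   # priority queue of (distance, node)
    visited = set()

    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue          # stale entry; already settled
        visited.add(node)
        if node == target:
            break             # target settled; its distance is final
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float('inf')):
                dist[neighbor] = nd
                prev[neighbor] = node
                heapq.heappush(heap, (nd, neighbor))

    if target not in dist:
        return None, float('inf')

    # Walk predecessor links back from target to source.
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], dist[target]
```

With the heap, the running time is O((V + E) log V), so a graph of 100 nodes and ~400 edges is trivial; even thousands of nodes pose no problem.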