Excel

I need help with an Excel problem. Here it is:

On an Excel workbook sheet, solve the differential equation,

dy/dt = sin(y)/cos(t) + 1

using initial condition y = 1.00 at t = 0.00

as follows:

a. Using a table, solve the differential equation with Euler's method for t = 0 to t = 10 in steps of 0.1.

b. Plot the resulting values of y vs. t as a red line. Label this plot stepsize = 0.1.

c. Solve the same differential equation with Euler's method for t = 0 to t = 10, but in steps of 1.0.

d. On the same graph as part b, plot the resulting values of y vs. t from part c as a blue line, and label this plot stepsize = 1.0.

Your plot should show the difference that step size has on the accuracy of the integrated result, y(t).
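For anyone landing here via search: Euler's method just repeats y_next = y + h*(sin(y)/cos(t) + 1) down a column. In Excel, assuming t values are in column A and y values in column B with the initial condition in row 2, a formula like =B2 + 0.1*(SIN(B2)/COS(A2) + 1) filled down would build the table (the exact cell layout is up to you). Here is a minimal sketch of the same iteration in Python rather than Excel formulas, so the arithmetic can be checked; the function and variable names are illustrative, not from the original problem:

```python
import math

def euler(f, t0, y0, t_end, h):
    """Integrate dy/dt = f(t, y) with Euler's method from t0 to t_end in steps of h.
    Returns the lists of t values and y values, like the two columns of the Excel table."""
    ts, ys = [t0], [y0]
    t, y = t0, y0
    for _ in range(round((t_end - t0) / h)):
        y = y + h * f(t, y)   # Euler update: y_next = y + h * f(t, y)
        t = t + h
        ts.append(t)
        ys.append(y)
    return ts, ys

# Right-hand side of the ODE: dy/dt = sin(y)/cos(t) + 1
f = lambda t, y: math.sin(y) / math.cos(t) + 1.0

ts_fine, ys_fine = euler(f, 0.0, 1.0, 10.0, 0.1)    # part a: stepsize = 0.1 (red line)
ts_coarse, ys_coarse = euler(f, 0.0, 1.0, 10.0, 1.0)  # part c: stepsize = 1.0 (blue line)
```

Plotting ys_fine vs. ts_fine and ys_coarse vs. ts_coarse on one chart gives the comparison asked for in parts b and d. Note that cos(t) vanishes at t = π/2, 3π/2, …; neither grid lands exactly on those points, but the derivative gets very large near them, which is part of why the two step sizes diverge so visibly.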
This has nothing to do with C++, so this doesn't belong in this forum or anywhere on this website.