I am trying to process some data in a table that I will later use to plot a graph. I am stumped and need an algorithm, or just ideas, on how to implement this.

For a better explanation, I will use these sample values.

When I plot a graph using this data directly, I get a step graph, if you all know what I mean. I want the graph to show the level decreasing gradually from 100% all the way down to 0%, and if the level increases, the graph should show that too.

time (hrs)   level (%)
0            100
2            100
2.05         0
48           0

In order to achieve this, I am supposed to compute a time step and a level step as follows, where dt is the interval at which the levels are calculated (for this case, let's say 5 minutes):

time_step = (t2 - t1) / dt

After getting time_step, I should also calculate level_step:

level_step = (L2 - L1) / time_step

Having done this, I am supposed to plot a time-level graph, decreasing the level by level_step after every 5 minutes.
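The two formulas above can be sketched like this for one hypothetical segment (the segment values here are made up, and the times are in minutes so they match dt):

```python
# One hypothetical segment: the level falls from 100% to 0% over 2 hours.
# Times are in minutes here so they are in the same units as dt.
t1, t2 = 0.0, 120.0   # segment start/end time (minutes)
L1, L2 = 100.0, 0.0   # segment start/end level (percent)
dt = 5.0              # sampling interval (minutes)

time_step = (t2 - t1) / dt           # number of dt intervals in the segment
level_step = (L2 - L1) / time_step   # level change per dt interval

print(time_step, level_step)
```

So over this segment the level would drop by about 4.17% every 5 minutes.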

My idea is to create arrays or a table showing the time in minutes and the level (%) after every 5 minutes. After that I will just fetch the new data and plot it on a graph.
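A minimal sketch of that idea in Python, using the formulas above (the function name and variable names are mine, and I have converted the sample times from hours to minutes so they match dt):

```python
def resample(points, dt=5.0):
    """Linearly interpolate (time_min, level_pct) points at roughly every dt minutes."""
    out = []
    for (t1, L1), (t2, L2) in zip(points, points[1:]):
        time_step = (t2 - t1) / dt           # number of dt intervals in this segment
        level_step = (L2 - L1) / time_step   # level change per interval
        t, L = t1, L1
        while t < t2:                        # walk the segment in dt-sized steps
            out.append((t, L))
            t += dt
            L += level_step
    out.append(points[-1])                   # keep the final data point
    return out

# sample data from the question, with hours converted to minutes
data = [(0.0, 100.0), (120.0, 100.0), (123.0, 0.0), (2880.0, 0.0)]
table = resample(data)   # list of (time, level) pairs ready to plot
```

Note that when a segment's length is not a whole multiple of dt (like the 3-minute drop from 120 to 123 min), the emitted points are not perfectly uniform, but they are still fine for plotting a smooth line.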

I need your ideas on the best way to approach this problem.

Thank you