Incorporating Fluctuating Water Levels into rLakeAnalyzer #62
Hi @brdeemer,

Good question. The R version doesn't yet have any of the downsampling or averaging routines that were in the Matlab version. We've often discussed adding them, but just haven't gotten there yet.

I did some digging around to figure out exactly what the Matlab version is doing (this is detailed so I can refer back later if we decide to implement). Basically, it looks to be implementing a simple one-pass moving average filter on the calculated layer depths (code here: https://github.com/GLEON/Lake-Analyzer/blob/master/Source/Run_LA.m#L428-L457). For the start of the timeseries, it pads the front with growing-window averages of the initial estimates. Doing this in R is fairly straightforward:

#create moving average filter. Filter window is in timesteps.
filter_window = 5
b = rep(1/filter_window, filter_window)   #equal weights for a simple moving average
x = seq(1, 100, by=1) + 3*rnorm(100)      #example noisy series

#pad the front with growing-window means of the initial values
padx = rep(NA, filter_window-1)
for(i in 1:(filter_window-1)){
  padx[i] = mean(x[1:i])
}
x_padded = c(padx, x)

#one-pass, right-aligned moving average
newx = filter(x_padded, b, method='convolution', sides=1)

#now drop the NA's on the front introduced by the window
newx = newx[filter_window:length(newx)]

plot(x)
lines(newx)
This is pretty much the same algorithm as is used in the Matlab version. Do you want to give this a try? Note: this doesn't do any downsampling first, which might be helpful with high-resolution buoy data. Also, I just arbitrarily selected 5 for the averaging window. A number like 24 (so a daily averaging window) might make more sense if you have hourly data. -Luke
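If downsampling first turns out to be useful, here is a minimal sketch of that step, assuming a data frame called buoy with a POSIXct datetime column and a numeric value column (both names are placeholders, not rLakeAnalyzer conventions):

#average high-resolution observations down to a daily timestep before smoothing
buoy$day = as.Date(buoy$datetime)
daily = aggregate(value ~ day, data=buoy, FUN=mean, na.rm=TRUE)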
I'm going to guess that you have previously loaded dplyr, am I right? It also has a filter() function, which masks the base stats::filter() when dplyr is attached. And yeah, rollapply is also a very good (if not better) way of doing the same thing. The only minor nuance is that (if I'm understanding the rollapply documentation correctly) the result from the Matlab version is right-aligned. But both left and right alignment result in a slight phase delay that is inconsequential if you're averaging to a daily timestep.
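If dplyr is attached, one way to avoid that masking is to call the base function by its namespace; a small sketch reusing x_padded and b from the code above:

#dplyr's filter() masks the base time-series filter; call stats::filter() explicitly
newx = stats::filter(x_padded, b, method='convolution', sides=1)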
Yes, you are right. I had dplyr loaded (and used it for the loop function I posted at the top of this thread). I'm not sure what you mean by left vs. right alignment, though. Glad to hear that this will work for daily averaging. Thanks again.
Right alignment means the average is of the past N observations; left alignment averages the upcoming N observations. Something like that. I don't think it matters for your purposes.
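To illustrate the alignment difference, a minimal sketch using zoo's rollapply (the window of 24 is arbitrary, e.g. hourly data averaged over a day):

library(zoo)
#right-aligned: each value averages the current and previous 23 observations (as in the Matlab filter)
right_avg = rollapply(x, width=24, FUN=mean, align='right', fill=NA)
#left-aligned: each value averages the current and next 23 observations
left_avg = rollapply(x, width=24, FUN=mean, align='left', fill=NA)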
Ok, thanks!
Hello again,
One other question I have about incorporating shifting water levels into rLakeAnalyzer regards plotting. Do you know of any way to depict changing depth in the contour time-series plots? Or perhaps you might know of another program that would do that well? Thanks so much! Bridget
There's no functionality available at the moment to do that. The Matlab version will do the level-varying plotting you're talking about.
Thanks, I will have a look. Just ran into one more thing. For the Wedderburn number, how would you suggest incorporating shifting lake level into Ao? Specifically, can you suggest a way to link into the bathy file to calculate an estimated Ao as the surface level changes?
Hmmm, not super easy. You'd need to change Ao to be time-varying; the example below passes Ao in as a vector. You could actually calculate it from the level-adjusted hypsometry data (interpolate the shifted data to depth zero at each timestep). Max depth would also need to vary with time in the layer density calculations, and you'd need to adjust the bathymetry based on water level. For example, shifting the bathymetry depths to mimic a 1 m lower water level:

bthA <- c(1000,900,864,820,200,10)
bthD <- c(0,2.3,2.5,4.2,5.8,7)
wtr <- c(28,27,26.4,26,25.4,24,23.3)
depths <- c(0,1,2,3,4,5,6)

cat('Schmidt stability for input is: ')
cat(schmidt.stability(wtr, depths, bthA, bthD))
cat('\n')
cat('Schmidt stability with level 1m lower:', schmidt.stability(wtr, depths, bthA, bthD - 1))

I roughed this out quickly. It's not working yet, but something kinda along those lines:

ts.wedderburn.number.level <- function(wtr, wnd, wnd.height, bathy, Ao, lvl, seasonal=TRUE, na.rm=TRUE){
  depths = get.offsets(wtr)

  # Make sure data frames match by date/time.
  all.data = merge(wtr, wnd, by='datetime')

  cols = ncol(all.data)
  wtr = all.data[,-cols]
  wnd = all.data[,c(1, cols)]

  n = nrow(wtr)
  w.n = rep(NA, n)
  wtr.mat = as.matrix(wtr[,-1])
  dimnames(wtr.mat) <- NULL

  for(i in 1:n){
    #check we have all the data necessary
    if(any(is.na(wtr.mat[i,])) || is.na(wnd[i,2])){
      next
    }

    m.d = meta.depths(wtr.mat[i,], depths, seasonal=seasonal)
    if(any(is.na(m.d))){
      next
    }

    #Need epi and hypo density for the wedderburn.number calc;
    #shift the hypsometry by the water level at this timestep
    temp = wtr.mat[i,]
    notNA = !is.na(temp)
    epi.dens = layer.density(0, m.d[1], temp[notNA], depths[notNA], bathy$areas, bathy$depths - lvl[i])
    hypo.dens = layer.density(m.d[2], max(depths[notNA]), temp[notNA], depths[notNA], bathy$areas, bathy$depths - lvl[i])

    uS = uStar(wnd[i,2], wnd.height, epi.dens)

    w.n[i] = wedderburn.number(hypo.dens - epi.dens, m.d[1], uS, Ao[i], hypo.dens)
  }

  output = data.frame(datetime=get.datetime(wtr), wedderburn.number=w.n)
  return(output)
}
Hello,
I am working with thermistor string data from a reservoir that undergoes a large water level drawdown. I have written a loop to incorporate shifting water levels into my calculation of Schmidt stabilities. I am now working to incorporate shifting water levels into my estimates of metalimnion extent/depth. I have a loop set up:
I get output that looks somewhat messy, though, and I think it would be better if I could use the temporal layer averaging that is described in the Read et al. 2011 Environmental Modelling & Software paper (I found it in the appendix: A.2.1.3 and A.2.1.4). The graph below, which shows the current output for maximum metalimnion depth, gives an idea of what I am working with. The index numbers are ordered by time; the earliest numbers are during the summer stratified period, and the latest numbers are during turnover.
Any help that you can offer would be sincerely appreciated.
Thanks so much in advance,
Bridget