Welcome to the discussion page! #216
-
Firstly, thanks for the great work on dtscalibration! I was wondering where I could find more information on how to perform single-ended calibration using the variance derived from “variance_stokes_linear”. It seems the link to https://github.com/dtscalibration/python-dts-calibration/blob/main/examples/notebooks/04Calculate_variance_Stokes.ipynb has been removed from the website? Thank you! Best,
-
Dear Bart Schilperoort,
Thanks for the prompt reply!
I am trying to calculate the variance with variance_stokes_linear and pass it to the single-ended calibration function.
I see no example of this in the documentation. Could you please provide one? Even a simple script would be appreciated.
Thank you!
Best,
Chin-Jen
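For reference, a minimal sketch of what such a script could look like, pieced together from the calls mentioned in this thread (read_silixa_files, variance_stokes_linear, the single-ended calibration). The reference-section names and distances are placeholders, and the exact signatures and return values of the variance and calibration routines are assumptions that may differ between dtscalibration versions; please check the API reference and the 04Calculate_variance_Stokes notebook mentioned above for your installed release.
from dtscalibration import read_silixa_files

filepath = "data/"  # directory containing the raw Silixa .xml files
ds = read_silixa_files(directory=filepath, timezone_netcdf="UTC", file_ext="*.xml")

# Reference sections along the fibre; names and distances are placeholders.
sections = {
    "probe1Temperature": [slice(7.5, 17.0)],   # e.g. warm bath
    "probe2Temperature": [slice(24.0, 34.0)],  # e.g. cold bath
}

# Estimate the Stokes and anti-Stokes noise variance with the linear estimator.
# The accessor path, signature, and return value are assumed here; some versions
# also return residuals or a fitted variance function.
st_var = ds.dts.variance_stokes_linear(st_label="st", sections=sections)
ast_var = ds.dts.variance_stokes_linear(st_label="ast", sections=sections)

# Pass the estimated variances to the single-ended calibration
# (called calibrate_single_ended / calibration_single_ended in this thread).
out = ds.dts.calibration_single_ended(sections=sections, st_var=st_var, ast_var=ast_var)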
-
Hi Bart,
Many thanks for providing this information!
I will read in detail and let you know if I still have questions.
Thank you!
Best,
Chin-Jen
-
Hi Bart,
I have another question. I remember seeing an example in the documentation showing how to down-sample the raw data.
However, I can't find it in the documentation now. I also ran into a problem when using the command ds = ds.resample(time="60S").mean() and then calling “calibrate_single_ended”.
Could you please show me how to down-sample the data correctly?
Thank you!
Best,
Chin-Jen
-
Hi Bart,
Thanks for the hint!
It seems the call below leaves the DataArray transposed, which is then not accepted by “variance_stokes_constant”. I will try to fix it.
ds1 = ds.resample(time="60S").mean(dim="time", keep_attrs=True)
I have another question regarding the Silixa ULTIMA and wonder if you could help. How long does each laser pulse and measurement take? When we configure the DTS with a sampling rate of 20 s, does it give the same resolution as averaging two samples measured at a sampling rate of 10 s?
Thank you!
Best,
Chin-Jen
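For reference, a small xarray-only sketch of one way to keep the dimension order intact after down-sampling; the dimension names "x" and "time" are assumptions based on a typical dtscalibration dataset.
# Resampling over time can leave the Dataset with its dimensions reordered.
ds1 = ds.resample(time="60S").mean(dim="time", keep_attrs=True)
# Explicitly restore the (x, time) order expected by the downstream routines;
# the trailing ... keeps any remaining dimensions in their current position.
ds1 = ds1.transpose("x", "time", ...)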
-
Thanks for the insight on the Silixa Ultima. Indeed, integration time is a better term than sampling rate.
I will open an issue about the resampling problem later, once I have worked out the details.
Thank you!!
Chin-Jen
Quoted reply from Bart Schilperoort (Thursday, July 25, 2024, 4:38 PM):
Ah, that's annoying. Can you open an issue here describing the error and problem? https://github.com/dtscalibration/python-dts-calibration/issues
The Silixa Ultima-S unit I used to use had about a 0.1-0.3 second processing time (depending on how full the hard drive was...). I believe there are many laser pulses per second, but the result is accumulated so "integration time" might be a better name than "sampling rate". The specified time on the machine represents how long the machine measures the fiber.
A 20s integration time or 10s integration time should barely have any overhead. Only once you get down below 5s integration times will the processing overhead become a small issue.
However, my experience is based on quite an old machine, so things might have changed in the meantime.
-
Hi Bart,
The resample call shown below works for me. It also takes care of the xarray transpose issue.
ds = ds.resample(time="60S").mean().transpose()
Thank you!
Best,
Chin-Jen
Quoted reply from Bart Schilperoort (Thursday, July 25, 2024, 1:45 PM):
Hi Chin-Jen,
Something like this should work (didn't try it myself):
from dtscalibration import read_silixa_files
ds = read_silixa_files(directory=filepath, timezone_netcdf="UTC", file_ext="*.xml")
ds = ds.resample(time="60S").mean()
ds = ds.dts.calibration_single_ended(...)
If that doesn't work, try doing the following for resampling:
ds = ds.resample(time="60S").mean(dim="time", keep_attrs=True)
But in general I would recommend first calibrating, then resampling. This way you would get the most accurate results. Additionally, sampling the fiber at a slightly higher frequency than needed for your application is also better, as this provides more information to the calibration and uncertainty estimation routines.
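As a short, hedged illustration of that calibrate-first order (continuing from the earlier sketch; the variable name "tmpf" for the calibrated forward temperature is an assumption and may differ in your version):
# Calibrate at the native integration time, then average the calibrated
# temperature down to 60-second windows afterwards.
out = ds.dts.calibration_single_ended(sections=sections, st_var=st_var, ast_var=ast_var)
tmpf_60s = out["tmpf"].resample(time="60S").mean(keep_attrs=True)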
-
👋 Welcome!
We’re using Discussions as a place to connect with other members of our community. We hope that you ask questions, share ideas, and remember that this is a community we build together 💪.
To get started, comment below with an introduction of yourself and tell us about what you do with this community.
Beta Was this translation helpful? Give feedback.