Developing Backend Scheduled App
At this point, you have the app template copied to your environment by the Corva app generator and your app shell created in Dev Center. Next, you need to write your logic.
First, let's take a look at the data we are going to use in the app.
Go to Dev Center and open the Dataset Explorer.
Find the completion.wits collection under Corva. You can choose an asset that has real-time data via the drop-down menu, or leave it empty.
Every event is a record in the collection. Real-time data collections share common fields such as provider, collection, timestamp, asset_id, company_id, and other metadata.
In this example we’re going to fetch wellhead_pressure and slurry_flow_rate_in from the data object of the collection.
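To make the shape concrete, a completion.wits record looks roughly like the sketch below. The field names match the collection described above; all values (asset id, company id, channel readings) are illustrative assumptions, not real data:

```python
# Illustrative shape of a completion.wits record (all values are made up).
example_record = {
    "provider": "corva",
    "collection": "completion.wits",
    "asset_id": 12345,        # hypothetical asset id
    "company_id": 42,         # hypothetical company id
    "timestamp": 1640995200,  # Unix epoch seconds
    "data": {
        "wellhead_pressure": 5213.7,  # illustrative reading
        "slurry_flow_rate_in": 8.4,   # illustrative reading
    },
}

# The app below reads these two channels from the "data" object:
wellhead_pressure = example_record["data"]["wellhead_pressure"]
slurry_flow_rate_in = example_record["data"]["slurry_flow_rate_in"]
```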
Now you know what data looks like.
Go to lambda_function.py and write the logic to process the data:
```python
from corva import Api, Cache, Logger, ScheduledNaturalTimeEvent, scheduled


@scheduled
def lambda_handler(event: ScheduledNaturalTimeEvent, api: Api, cache: Cache):
    """Test app"""
    Logger.info('Start')

    # Request data from a Corva collection
    data = api.get_dataset(
        provider='corva',
        dataset='completion.wits',
        query={
            'asset_id': event.asset_id,
        },
        sort={'timestamp': 1},
        limit=1,
        fields='data,timestamp',
    )
    Logger.info(f'{data=}')

    # Take the first record's data
    timestamp = data[0]["timestamp"]
    wellhead_pressure = data[0]["data"]["wellhead_pressure"]
    slurry_flow_rate_in = data[0]["data"]["slurry_flow_rate_in"]

    # Compose a record for our collection
    body = {
        "asset_id": event.asset_id,
        "version": 1,
        "timestamp": timestamp,
        "company_id": event.company_id,
        "data": {
            "wellhead_pressure": wellhead_pressure,
            "slurry_flow_rate_in": slurry_flow_rate_in,
        },
    }
    Logger.info(f'{body=}')

    # Post the record to our dataset
    api.post('/api/v1/data/sample/my-completion-dataset/', data=[body]).raise_for_status()
    Logger.info('Done!')
```
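To clarify what the get_dataset call above asks the server to do, here is a plain-Python sketch of the same selection logic: filter by asset_id, sort by timestamp ascending (sort={'timestamp': 1}), and return at most one record (limit=1). The sample records are hypothetical:

```python
# Hypothetical records standing in for documents in completion.wits.
records = [
    {"asset_id": 1, "timestamp": 300, "data": {"wellhead_pressure": 5100.0}},
    {"asset_id": 1, "timestamp": 100, "data": {"wellhead_pressure": 5000.0}},
    {"asset_id": 2, "timestamp": 200, "data": {"wellhead_pressure": 4900.0}},
]

def select(records, asset_id, limit=1):
    # query={'asset_id': ...} -> keep only matching records
    matching = [r for r in records if r["asset_id"] == asset_id]
    # sort={'timestamp': 1} -> ascending by timestamp
    matching.sort(key=lambda r: r["timestamp"])
    # limit=1 -> only the earliest record
    return matching[:limit]

data = select(records, asset_id=1)
# data holds a single record: the earliest one for asset 1 (timestamp 100)
```

This also shows why the app indexes data[0]: with limit=1 and an ascending sort, the response is a one-element list containing the earliest matching record.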
Next, you need to deploy your app to Dev Center.
For more information about the Corva API and Corva Data API, see the API Tutorial.