Help parsing JSON stream into Python with BASH and jq
I'm decent at writing Python code, but bash is fairly new to me, and I'd like some guidance on picking a high-level approach to my goal: reading sensor data from a live-streaming JSON file and processing it in Python in real time.
The setup:
I have a system running Ubuntu 18.04. It is connected to a peripheral device which reads some sensors and streams the readings to a JSON file. To begin the stream, I've written a simple bash script that creates the log directory and starts the service:
sudo service my_service stop

# LOGDIR and SERVDIR are set earlier in the script.
if [ -d "$LOGDIR" ]; then
    echo "All data in ws_log will be erased. Continue? [Y/n]: "
    read ans
    if [ "${ans,,}" = "y" ]; then
        sudo rm -Rf "${LOGDIR:?}"/*    # :? guards against an unset/empty LOGDIR
        echo "Directory cleared."
    else
        echo "Exiting script."
        exit 1
    fi
else
    mkdir "$LOGDIR"
    echo "$LOGDIR successfully created!"
fi

sudo "$SERVDIR/my_service" --recording-directory "$LOGDIR"
Once the service is running, it prints a message telling the user to press Ctrl+C when they are finished logging (so the service blocks the terminal until it receives SIGINT).
I'm experimenting with running this in a detached screen session and passing commands to it via Python's subprocess module. However, this seems complicated, and I'm not sure screen is the right tool for the job here. Thoughts?
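For context, this is roughly the kind of control I'm imagining from Python instead of screen (just a sketch; the paths are placeholders for my real SERVDIR and LOGDIR, and signal delivery through sudo may need extra care):
import signal
import subprocess
import time

# Placeholder paths standing in for $SERVDIR and $LOGDIR in the bash script above.
cmd = ["sudo", "/opt/my_service/my_service", "--recording-directory", "/var/log/ws_log"]

proc = subprocess.Popen(cmd)
try:
    time.sleep(60)                    # ...log for as long as needed...
finally:
    proc.send_signal(signal.SIGINT)   # equivalent to pressing Ctrl+C
    proc.wait()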
Next question:
Each object in the JSON file is a packet containing six samples and a timestamp. The file is appended to 20 times per second. It starts with '[' and then contains comma-delimited JSON objects. The issue is that, because it is streaming, the file never gets a closing ']'; it ends with a comma.
So far I've come up with this command, which uses sed to replace the last comma in the JSON file with ']' so that jq can recognize the JSON objects and write them out, one object per line:
sed "$ s/,/]/g" log.json | jq -c '.[]' | tail -F >> simpler_log.txt
However, I want this pipeline to run continuously until my Python or bash script tells it to stop. Ideally I could rely on tail -f's file tracking to avoid reading the same line twice. How should I fix that, or should I be doing this part differently?
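One direction I'm considering (only a sketch, and it assumes each JSON object in log.json sits on its own line and ends with a comma) is to let Python own the tail pipeline, so it can be stopped cleanly and parsed directly without sed or jq:
import json
import subprocess

# Follow log.json from line 2 onward, skipping the opening '['; -F keeps
# following even if a new logging run recreates the file.
tail = subprocess.Popen(
    ["tail", "-n", "+2", "-F", "log.json"],
    stdout=subprocess.PIPE, universal_newlines=True)
try:
    for line in tail.stdout:
        line = line.strip().rstrip(",")   # drop the trailing comma
        if not line or line == "]":
            continue
        packet = json.loads(line)         # one packet: 6 samples + a timestamp
        # ...process packet here...
finally:
    tail.terminate()                      # stop following when finished
    tail.wait()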
Final question:
How would you suggest I push/pull this data into my Python script?
I figure I can just count the number of lines I've already parsed and make subsequent reads along these lines:
pseudo code: json.loads(log_file[last_rec_line+1:-1])
But I'm open to hearing any suggestions you may have for how to get the data into Python efficiently.
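To make that pseudocode concrete, this is the sort of reader I have in mind for simpler_log.txt (again only a sketch; it assumes the jq step has already written one complete JSON object per line):
import json
import time

def follow_packets(path, poll_interval=0.05):
    # Yield each new JSON object appended to `path`, without re-reading old lines.
    with open(path) as fh:
        while True:
            pos = fh.tell()
            line = fh.readline()            # returns '' at EOF instead of blocking
            if not line:
                time.sleep(poll_interval)   # the file grows ~20 times per second
                continue
            if not line.endswith("\n"):
                fh.seek(pos)                # partial line: rewind and retry later
                time.sleep(poll_interval)
                continue
            yield json.loads(line)

# Example use:
# for packet in follow_packets("simpler_log.txt"):
#     handle(packet)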
I know there is a lot here; I really appreciate any help you may have to offer!
ubuntu python json streaming jq
asked 32 mins ago by Nick Franklin (new contributor)
Partial answer: use tail -F to read log.json, and pipe that into sed rather than letting sed read the file. The final | tail -F doesn't obviously do anything useful. You might need your sed to delete the first [ and to wrap each line in [ and ], but at that point I am not sure the jq is adding any value. Can you show some typical lines from log.json and simpler_log.txt? – icarus, 8 mins ago