Launching Multiple Queries With Bash Script


I have this script (based on this post) to check the HTTP status (000, 418, 404, etc.) of a list of domains, and I would like to improve it:



#!/bin/bash
# Change this path to your URL list.
yourlist="$(pwd)/url-list.txt"

# Resume counter: the line number to continue from (defaults to 1).
advance=$(cat advance.txt 2>/dev/null || echo 1)

while read -r LINE; do
    curl -o /dev/null --silent --head --write-out '%{http_code}' "$LINE"
    echo " $LINE"
    advance=$((advance + 1))
    echo "$advance" > advance.txt
done < <(tail -n +"$advance" "$yourlist") >> out


Input file (url-list.txt):



amazon.com
google.com


Output file (out):



301 google.com
301 amazon.com


If the script is interrupted for any reason, it resumes from where it left off, using the line counter saved in advance.txt.



Problem: It is very slow, since it checks the URLs one at a time.



Tests: I have already tried other alternatives, such as fping (too limited given the size of the list), pyfunceble (freezes the PC), wget (doesn't do what I expect), GNU parallel, etc. None of them has convinced me.



Not yet tested: A user recommended this answer, based on xargs -P (xargs -P 4 -L 1 -n 1), but I don't know how to implement it because I have no experience with xargs. I also found another, similar answer in bash, but it hasn't worked for me so far.



Question: How can I launch multiple queries (parallel processing) from my script so that it processes many lines at the same time, ideally with a manually configurable number of parallel jobs, so that it doesn't freeze or block the script or the PC?
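For reference, here is a minimal, untested sketch of the xargs -P approach mentioned above. It assumes one domain per line in url-list.txt and drops the advance.txt resume logic, since parallel workers would otherwise race to update the same counter file:

#!/bin/bash
# Sketch only: let xargs run up to 4 curl checks at a time.
# Each domain is passed to a small sh -c wrapper as $0.
xargs -P 4 -n 1 sh -c '
    code=$(curl -o /dev/null --silent --head --write-out "%{http_code}" "$0")
    echo "$code $0"
' < url-list.txt >> out

Note that with parallel workers, the lines in out will no longer appear in the same order as in url-list.txt.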










Tags: bash curl http

asked 5 hours ago by ajcg (edited 3 hours ago)



























  • Probably helpful: correct xargs parallel usage and parallel processing using xargs and similar questions.

    – Kusalananda
    5 hours ago













  • @Kusalananda Very interesting, checking it out. Thanks.

    – ajcg
    4 hours ago











  • @Kusalananda It would be something like wrapping my curl script in xargs: xargs -P 4 -L 1 sh -c '<my script curl> $0 || exit 255' < url-list.txt, etc.? But I have no idea; I know very little about xargs.

    – ajcg
    4 hours ago











  • If any invocation of the command exits with a status of 255, xargs will stop immediately without reading any further input. An error message is issued on stderr when this happens. linux.die.net/man/1/xargs

    – ajcg
    4 hours ago
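As a rough, untested illustration of that behaviour (reusing the flags from the comment above): making each worker exit with status 255 when curl itself fails causes xargs to stop dispatching the remaining URLs.

# Sketch only: abort the whole parallel run on the first hard curl failure
# (e.g. a DNS error); an exit status of 255 tells xargs to read no further input.
xargs -P 4 -n 1 sh -c \
    'curl -o /dev/null --silent --head --write-out "%{http_code}\n" "$0" || exit 255' \
    < url-list.txt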


















1 Answer















You could split up your URL list into e.g. 10 parts and use a main script like:



./subscript1.sh &
./subscript2.sh &
...
./subscript10.sh &


to run them in parallel. Caution: they should use distinct log files.
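As a rough illustration of this splitting approach (assuming GNU split for the -n l/10 option, and reusing the curl check from the question; the file names and the 10-way split are arbitrary):

#!/bin/bash
# Sketch only: split the list into 10 line-based chunks and check each
# chunk in the background, writing a separate log file per chunk.
split -n l/10 url-list.txt part.        # creates part.aa, part.ab, ...

for f in part.*; do
    (
        while read -r url; do
            code=$(curl -o /dev/null --silent --head --write-out '%{http_code}' "$url")
            echo "$code $url"
        done < "$f" > "out.$f"
    ) &
done
wait                                    # block until every chunk is done
cat out.part.* > out                    # optional: merge the per-chunk logs

Each background job writes to its own out.part.* file, which avoids the interleaved writes that a single shared log would get.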






answered 3 hours ago by mifritscher



























