/usr/bin/truncate: Argument list too long
I want to use the truncate command to create a huge number of small files for testing. I tried the command with a small number of files (100) and it worked, but when I changed the number to 1000000 it reported an error:
root:[~/data]# truncate -s 1k {1..100}
root:[~/data]# rm -rf *
root:[~/data]# truncate -s 1k {1..1000000}
-bash: /usr/bin/truncate: Argument list too long
root:[~/data]#
How can I solve it? I have a sense that xargs could be used, but I can't make it work.
bash files xargs arguments truncate
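For reference, "Argument list too long" is the E2BIG error from execve(2): the expanded brace {1..1000000} produces several megabytes of argument text, more than the kernel accepts for argv plus the environment in a single exec. A minimal sketch for inspecting the ceiling (the exact value varies by system):

```shell
# The shell expands {1..1000000} before running truncate, so truncate is
# handed ~6.9 MB of arguments in one execve() call, which fails with E2BIG.
# getconf reports the system's limit on argv + environment size in bytes.
getconf ARG_MAX
```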
seq 1 10000000 | xargs truncate -s 1k – mosvy, 7 hours ago
What design decision led to you needing a million files in the same directory? Ugh! – roaima, 7 hours ago
I want to benchmark different ways of deleting these files. – Just a learner, 7 hours ago
asked 7 hours ago by Just a learner, edited 6 hours ago by Jeff Schaller♦
1 Answer
You could do
echo {1..1000000} | xargs truncate -s 1k
(That should work in a shell whose builtin echo isn't subject to command-line length limits. Also, xargs splits its input on any whitespace by default, but that doesn't matter here.)
The above might use an awful amount of memory, so using seq as in mosvy's comment might be better:
seq 1 1000000 | xargs truncate -s 1k
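If you want to control how many names each truncate invocation receives, xargs accepts -n to cap the arguments per batch. A minimal sketch using a small count and a temporary directory (the 1000 files and batch size of 500 are illustrative, not from the question):

```shell
# xargs reads names from stdin and runs truncate repeatedly, at most
# 500 arguments per invocation, so no single execve() exceeds ARG_MAX.
dir=$(mktemp -d)
cd "$dir"
seq 1 1000 | xargs -n 500 truncate -s 1k
ls | wc -l    # prints 1000
```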
I usually use a loop, though (this starts from 000000, not 1, and has leading zeroes in all names):
for i in {000..999}; do
    touch "$i"{000..999}
done
Having a million files in a single directory is probably going to be slow, so unless you're testing just that, it might be a good idea to spread them into subdirectories instead, say:
for i in {000..999}; do
    mkdir "$i"
    touch "$i"/{000..999}
done
Note that if the filenames don't fit on one command line at creation time and you have to work around that somehow, you probably won't be able to remove them with rm -f * either, for the same reason. You'd need to remove the whole tree recursively, or do something like find . -maxdepth 1 -type f -delete.
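Since the asker's stated goal is benchmarking deletion, the alternatives can be timed side by side. A minimal sketch, using 10000 files rather than a million so the run stays short, and recreating the files before each run so the measurements are comparable:

```shell
# Compare two deletion strategies on identical fresh file sets.
dir=$(mktemp -d)
cd "$dir"

seq 1 10000 | xargs touch
time find . -maxdepth 1 -type f -delete     # find's built-in -delete

seq 1 10000 | xargs touch
time sh -c 'ls | xargs rm -f'               # batched rm via xargs
```

Piping ls into xargs is safe here only because the generated names contain no whitespace or special characters; for arbitrary names, stick with find -delete.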
answered 7 hours ago by ilkkachu, edited 7 hours ago
Thanks for contributing an answer to Unix & Linux Stack Exchange!