Download-extract all content of a directory in a tarball into an existing directory to override everything besides one or more exceptions


I have a MediaWiki 1.32.0 website which I want to upgrade, hosted on a CentOS "Shared Server" environment.

It is an all-core website with no added extensions, skins or images (besides the logo).

To upgrade, I need to replace essentially all files in the website's directory with those inside the corresponding directory of a newer MediaWiki version (available inside a tarball), in one general overriding operation.

To download the latest MediaWiki tarball containing such a directory (as of 2019-08-13), one could execute:



wget https://releases.wikimedia.org/mediawiki/1.33/mediawiki-1.33.0.tar.gz


In my existing website directory, these are the files I have already edited and that should not be overridden:





  1. LocalSettings.php

  2. robots.txt

  3. .htaccess


  4. example.com.png (logo image)


  5. googlec69e044fede13fdc.html (Google search console verification file)




How could I download the tarball and extract all files from the directory inside it into my current MediaWiki directory, overriding all files besides the listed exceptions (such as the files listed above)?
I do plan to back up the old directory manually before the changes, as a priming part of the script; adding a condition to continue only if the backup was done might be a nice idea (all of this aside from having automatic daily backups).
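
For concreteness, a minimal sketch of the priming part I have in mind follows; the site and backup paths are placeholders made up for illustration, and the exclusion-aware extraction itself, which is what this question asks about, is left as a stub:

#!/bin/bash
# Sketch only: download the tarball, back up the current site, and continue
# only if the backup succeeded. All paths below are assumed placeholders.
set -euo pipefail

site_dir="/var/www/example.com"                        # existing MediaWiki directory (placeholder)
backup_dir="/var/www/backups/mediawiki-$(date +%F)"    # backup destination (placeholder)

wget -O /tmp/mediawiki-1.33.0.tar.gz \
    https://releases.wikimedia.org/mediawiki/1.33/mediawiki-1.33.0.tar.gz

# Back up the old directory; abort the upgrade unless the copy succeeded.
if cp -a "$site_dir" "$backup_dir"; then
    echo "Backup created at $backup_dir"
else
    echo "Backup failed; aborting the upgrade." >&2
    exit 1
fi

# ... the exclusion-aware extraction into "$site_dir" would go here.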










Tags: tar, mediawiki

Asked Aug 12 at 17:19 by JohnDoea; edited 13 hours ago by terdon.

This question has an open bounty worth +100 reputation from JohnDoea, ending in 6 days: this question has not received enough attention.

  • terdon, doesn't lang=bash work here? And shouldn't stepped instructions be formatted as lists, for accessibility? Splitting a passage into two sub-passages with <br> is wrong to you, and I don't know why. I don't understand at least the formatting aspect of the rejection.

    – JohnDoea
    14 hours ago

  • Your edit was making the answer into a list (why?) and using the wrong format for lists (you want 1. not 1)); it added a grammatically incorrect phrase (In suggestion* see:) and then just changed the formatting commands that were fine already. Finally, you added a linebreak for no apparent reason. None of these changes were improving the post. But if you disagree with an edit rejection, please take it to meta. The comments are not the right place to discuss it.

    – terdon
    14 hours ago

  • IMO, steps should generally be in lists, but never mind here; I was wrong about 1), as in this particular case 1. was indeed fine. I don't know why you object to "In suggestion"; to me it's fine. I tried to explain why I think the <br> is good... Thanks anyway for taking my inquiry seriously --- I will not go to meta. I still ask about lang=bash, though: is it "valid" on SE?

    – JohnDoea
    13 hours ago

  • See Implement ```-style (fenced) Markdown code blocks. The right format is lang-bash not lang=bash but the principle is sound, yes.

    – terdon
    13 hours ago

2 Answers

Answer by markgraf (answered 14 hours ago; score 2):

Create a file exclude.me



LocalSettings.php
robots.txt
.htaccess
example.com.png
googlec69e044fede13fdc.html


Extract the archive



tar xvzf mediawiki-1.33.0.tar.gz --exclude-from=exclude.me --strip-components 1 -C /path/to/your/wiki


see https://unix.stackexchange.com/a/419400/364705 and https://stackoverflow.com/a/30259783



But if this is a production server with important data, I'd still suggest going the way terdon described.
And maybe just symlink the extracted archive to where your MediaWiki is: the old-fashioned Unix way.
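
As an aside (not part of the original answer): the archive's entries all sit under a single top-level mediawiki-1.33.0/ directory, as the other answer's tree output shows, which is why --strip-components 1 is used here; it drops that leading component so that -C places the contents directly in the wiki directory. A quick way to check this before extracting:

# List the first archive entries; each starts with "mediawiki-1.33.0/",
# the single leading path component that --strip-components 1 removes.
tar tzf mediawiki-1.33.0.tar.gz | head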






  • Ah, nice! But does the --exclude-from expect a file with just file names or actual paths? And relative or absolute paths? What if there are multiple example.com.png files, for instance, and only one should be kept?

    – terdon
    14 hours ago

  • To be fair, only example.com.png was listed, without any wildcards or indication of multiple various names.

    – Jeff Schaller
    14 hours ago

  • @JeffSchaller oh yes, absolutely. I'm just thinking about a case where one of the file names to be kept exists in multiple places in the target directory structure and maybe only one needs to be kept. I'm not saying there's anything wrong with this answer, on the contrary, I've already upvoted it!

    – terdon
    13 hours ago

  • markgraf, why did you use --strip-components 1? Thanks anyway.

    – JohnDoea
    10 hours ago



















Answer by terdon (answered Aug 12 at 18:09, edited 13 hours ago; score 3):

Extracting the tarball you have in your question will create the directory mediawiki-1.33.0 which contains the following sub-directories:



$ tree -dL 1 mediawiki-1.33.0
mediawiki-1.33.0
├── cache
├── docs
├── extensions
├── images
├── includes
├── languages
├── maintenance
├── mw-config
├── resources
├── skins
├── tests
└── vendor

12 directories


Assuming these are also the directories you need in a proper MediaWiki installation, all you need to do is the following (a consolidated sketch of these steps appears after the list):





  1. Back up the files you want to keep, using -p to keep the permissions, ownership and timestamps unchanged.



    cp -p LocalSettings.php robots.txt .htaccess example.com.png googlec69e044fede13fdc.html /some/other/path



  2. Extract the tarball



    tar xvzf mediawiki-1.33.0.tar.gz



  3. Copy the files to wherever they should be



    cp -ra mediawiki-1.33.0/* /path/to/mediawiki/installation


    This will overwrite any existing files.




  4. Copy the backups back to their original locations



    cp -p /some/other/path/LocalSettings.php /original/path
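
As referenced above, a minimal consolidated sketch of the four steps, assuming the wiki lives in /var/www/example.com, the downloaded tarball sits in /tmp, and all of the kept files are at the wiki root (these paths are assumptions for illustration, not part of the answer):

cd /tmp

# 1. Back up the files to keep, preserving permissions, ownership and timestamps.
mkdir -p /tmp/mediawiki-keep
cp -p /var/www/example.com/{LocalSettings.php,robots.txt,.htaccess,example.com.png,googlec69e044fede13fdc.html} \
      /tmp/mediawiki-keep/

# 2. Extract the tarball (creates ./mediawiki-1.33.0).
tar xvzf mediawiki-1.33.0.tar.gz

# 3. Copy the new tree over the existing installation, overwriting files in place.
cp -ra mediawiki-1.33.0/* /var/www/example.com/

# 4. Restore the kept files to their original locations
#    (.htaccess copied separately, since * does not match hidden files).
cp -p /tmp/mediawiki-keep/* /var/www/example.com/
cp -p /tmp/mediawiki-keep/.htaccess /var/www/example.com/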







  • Hello dear terdon, please see my last edit; I think I misexplained what I want to do. TIA.

    – JohnDoea
    yesterday

  • @JohnDoea I don't understand what you mean. Isn't this exactly what you want to do?

    – terdon
    yesterday

  • Sadly it isn't; I desire not to back up and copy anything, just to download and extract the new MediaWiki installation's content directly into the existing website directory, overriding everything besides the noted exceptions. And after that, delete the installation tarball.

    – JohnDoea
    yesterday

  • Edited again to try to better explain my intention. Sorry for explaining it poorly...

    – JohnDoea
    yesterday

  • 2
    @JohnDoea yes, I understand that's what you wanted, but that's not a good approach. So I posted an answer explaining how simple it is to just copy the few files you need and then overwrite everything and copy them back. I don't know if it will be possible to tell tar to not overwrite specific files only, but if it is, it will be more complicated than this extremely simple approach. Maybe you could change the permissions on the files so you don't have write access to them but, again, that's more complicated. So this is the answer I would recommend.

    – terdon
    yesterday













