Is there any way to limit the fdupes search space?


I have a pair of disks, D1 and D2.

I want to determine whether every file in D2 has a corresponding copy somewhere in D1.

D1 contains approximately 4000 times as many files as D2.

fdupes -r D1 D2 searches for all duplicates anywhere within D1 or D2, which requires a tremendous amount of computation across all the files in D1.

Is there a way to direct fdupes to search only for duplicates of files in D2 that exist in D1?
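If fdupes itself cannot be restricted this way, one workaround is to do the cross-check with ordinary shell tools: hash every file on both disks and report any D2 file whose checksum never appears under D1. The sketch below is illustrative, not an fdupes feature; it assumes GNU coreutils (find, sha256sum, sort, join), filenames without embedded newlines, and uses d1.sums/d2.sums as scratch files chosen for the example.

    # Checksum both trees, then list every file in D2 whose content hash
    # never occurs anywhere under D1.
    export LC_ALL=C

    find D1 -type f -exec sha256sum {} + | sort -k1,1 > d1.sums
    find D2 -type f -exec sha256sum {} + | sort -k1,1 > d2.sums

    # join -v 2 prints the lines of d2.sums whose hash (field 1) has no
    # match in d1.sums, i.e. D2 files with no identical copy in D1.
    join -v 2 -j 1 d1.sums d2.sums

Unlike fdupes, this hashes every file under D1 even when a size mismatch would have ruled out a match, so it trades fdupes's size pre-filtering for a single predictable pass; if D1 changes rarely, d1.sums can be kept and reused across runs.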










fdupes

asked 1 hour ago
jtb