Copy files to aws s3 bucket using Ansible


My plan is to copy a file from an EC2 instance to an S3 bucket using Ansible. Here is the playbook I wrote, but I'm getting an error:



copy2s3.yml



---
- name: Copy to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
    overwrite: no


I get the following error:



$ ansible-playbook copy2s3.yml -i 172.18.2.12,

ERROR! 's3' is not a valid attribute for a Play

The error appears to have been in '/home/ubuntu/bk/copy2s3.yml': line 2, column 3, but may
be elsewhere in the file depending on the exact syntax problem.

The offending line appears to be:

---
- name: Copy to s3
^ here









Tags: aws, ansible






asked Apr 3 '17 at 13:27 by Nullpointer

























5 Answers




































The module name (s3) should be at the same indentation level as name:



- name: Copy to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: get
    overwrite: no





answered Apr 3 '17 at 14:20 by Marko Živanović
• I used the same, but I'm still getting the same error, and the YAML syntax checks out. Do I need to install an s3 plugin for Ansible? – Nullpointer, Apr 4 '17 at 4:33













• If the syntax is OK, then what other error are you getting? The s3 module should be available by default. Also try updating your Ansible and boto libraries: pip install -U ansible boto. – Marko Živanović, Apr 4 '17 at 9:20











• I resolved the issue with the above, but I needed to add hosts, tasks, connection and gather_facts at the top of the script; then it works. – Nullpointer, Apr 4 '17 at 9:25













• Yes, the above is just a single task; if you want to run it as a playbook, you'll need the parameters you described. – Marko Živanović, Apr 4 '17 at 9:29
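Putting the answer and the comment thread together: the error occurs because the top-level list item is parsed as a Play, and a Play only accepts keys like hosts and tasks. A minimal complete playbook might look like this (a sketch; the hosts, connection, and gather_facts values are illustrative, not from the question):

```yaml
---
- hosts: localhost        # illustrative; use the host the file lives on
  connection: local
  gather_facts: false
  tasks:
    - name: Copy to s3
      s3:
        aws_access_key: "{{ lookup('env','aws_key') }}"
        aws_secret_key: "{{ lookup('env','aws_secret') }}"
        bucket: "{{ aws_packages_bucket }}"
        object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
        dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
        mode: get
        overwrite: no
```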





































To copy an object from the local server to S3 with this Ansible module, use

mode: put

mode: get is used to download an object from S3.






answered Aug 11 '17 at 12:40 by Naveen Kumar, edited Aug 11 '17 at 13:00 by Stephen Rauch
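A minimal sketch of the upload direction, reusing the paths from the question for illustration (this assumes the module's put mode reads the local file from src rather than dest, as documented for s3/aws_s3):

```yaml
- name: Upload to s3
  s3:
    aws_access_key: "{{ lookup('env','aws_key') }}"
    aws_secret_key: "{{ lookup('env','aws_secret') }}"
    bucket: "{{ aws_packages_bucket }}"
    object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
    src: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
    mode: put
```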


















            I had a similar issue when using aws_s3, the replacement module for s3.



            Check to see if you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.



I had both boto and boto3 installed, but because I had been experimenting with virtual environments they were only installed for Python 3.5 and not for any other Python version. Thus the Python that Ansible uses (Python 2.7 on my setup) could not import them, and it failed with this rather esoteric error message.



To verify that everything is correctly installed, run python on the command line and try to import both libraries manually:



            13:20 $ python
            Python 2.7.12 (default, Nov 19 2016, 06:48:10)
            [GCC 5.4.0 20160609] on linux2
            Type "help", "copyright", "credits" or "license" for more information.
            >>> import boto
            >>> import boto3
            >>>
            13:21 $ python3
            Python 3.5.2 (default, Sep 14 2017, 22:51:06)
            [GCC 5.4.0 20160609] on linux
            Type "help", "copyright", "credits" or "license" for more information.
            >>> import boto
            >>> import boto3
            >>>


If the import fails in Python, you will get the error in Ansible.






answered Oct 28 '17 at 12:47 by Norm1710
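The manual import check above can also be scripted. Here is a small helper (hypothetical, not part of Ansible) that reports which modules the current interpreter can import; for this question you would pass ["boto", "boto3"]:

```python
import importlib.util

def check_modules(names):
    """Map each module name to True if the current interpreter can import it."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# For Ansible's s3/aws_s3 modules you would check: check_modules(["boto", "boto3"])
print(check_modules(["json", "no_such_module_xyz"]))
```

Run it with the same interpreter Ansible uses (ansible_python_interpreter), since that is the environment that matters here.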


















Add hosts: and tasks: above - name, like this:



---
- hosts: localhost
  tasks:
    - name: Copy to s3
      s3:
        aws_access_key: "{{ lookup('env','aws_key') }}"
        aws_secret_key: "{{ lookup('env','aws_secret') }}"
        bucket: "{{ aws_packages_bucket }}"
        object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
        dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
        mode: put
        overwrite: no





This example may help you:



---
- name: Bucket copy
  hosts: localhost
  become_method: sudo
  become_user: root
  become: true
  gather_facts: False
  vars_files:
    - varlist.yml
  tasks:
    - name: Get s3 objects            # build a list of directories and files in a register
      aws_s3:
        bucket: "{{ Bucket_name }}"
        mode: list
        aws_access_key: "{{ aws_access_key }}"
        aws_secret_key: "{{ aws_secret_key }}"
      register: s3_object_list

    - name: Create download directory # create directories for the latest code from the S3 bucket
      file:
        path: "S3/{{ item }}"
        state: directory
      with_items:
        - "{{ s3_object_list.s3_keys }}"
      ignore_errors: true

    - name: Download s3 objects       # download files into their matching directories server-side
      aws_s3:
        bucket: "{{ Bucket_name }}"
        object: "{{ item }}"
        mode: get
        dest: "S3/{{ item }}"
        aws_access_key: "{{ aws_access_key }}"
        aws_secret_key: "{{ aws_secret_key }}"
      with_items:
        - "{{ s3_object_list.s3_keys }}"
      ignore_errors: true

    - name: Folder permissions
      file:
        path: S3/*
        state: touch
        mode: "u=rw,g=r,o=r"





answered by Shaha Nawaj Mulla, a new contributor























                  Your Answer








                  StackExchange.ready(function() {
                  var channelOptions = {
                  tags: "".split(" "),
                  id: "106"
                  };
                  initTagRenderer("".split(" "), "".split(" "), channelOptions);

                  StackExchange.using("externalEditor", function() {
                  // Have to fire editor after snippets, if snippets enabled
                  if (StackExchange.settings.snippets.snippetsEnabled) {
                  StackExchange.using("snippets", function() {
                  createEditor();
                  });
                  }
                  else {
                  createEditor();
                  }
                  });

                  function createEditor() {
                  StackExchange.prepareEditor({
                  heartbeatType: 'answer',
                  autoActivateHeartbeat: false,
                  convertImagesToLinks: false,
                  noModals: true,
                  showLowRepImageUploadWarning: true,
                  reputationToPostImages: null,
                  bindNavPrevention: true,
                  postfix: "",
                  imageUploader: {
                  brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
                  contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/4.0/"u003ecc by-sa 4.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
                  allowUrls: true
                  },
                  onDemand: true,
                  discardSelector: ".discard-answer"
                  ,immediatelyShowMarkdownHelp:true
                  });


                  }
                  });















                  draft saved

                  draft discarded
















                  StackExchange.ready(
                  function () {
                  StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2funix.stackexchange.com%2fquestions%2f355613%2fcopy-files-to-aws-s3-bucket-using-ansible%23new-answer', 'question_page');
                  }
                  );

                  Post as a guest















                  Required, but never shown


























                  5 Answers
                  5






                  active

                  oldest

                  votes








                  5 Answers
                  5






                  active

                  oldest

                  votes









                  active

                  oldest

                  votes






                  active

                  oldest

                  votes









                  2


















                  Module name (s3) should be at the same indentation level as the name:



                  - name: Copy to s3
                  s3:
                  aws_access_key: "{{ lookup('env','aws_key') }}"
                  aws_secret_key: "{{ lookup('env','aws_secret') }}"
                  bucket: "{{ aws_packages_bucket }}"
                  object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  mode: get
                  overwrite: no





                  share|improve this answer



























                  • I use same but still getting same error, showing YAML syntax ok !, Is need to install s3 plugins in ansible?

                    – Nullpointer
                    Apr 4 '17 at 4:33













                  • If syntax is ok, then, what other error are you getting? S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

                    – Marko Živanović
                    Apr 4 '17 at 9:20











                  • I resolve issue with above but need to write hosts, tasks, connection and gather_facts above on script then its works

                    – Nullpointer
                    Apr 4 '17 at 9:25













                  • Yes, the above is just a single task, if you want to run it as a playbook, you'll need the parameters you described.

                    – Marko Živanović
                    Apr 4 '17 at 9:29
















                  2


















                  Module name (s3) should be at the same indentation level as the name:



                  - name: Copy to s3
                  s3:
                  aws_access_key: "{{ lookup('env','aws_key') }}"
                  aws_secret_key: "{{ lookup('env','aws_secret') }}"
                  bucket: "{{ aws_packages_bucket }}"
                  object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  mode: get
                  overwrite: no





                  share|improve this answer



























                  • I use same but still getting same error, showing YAML syntax ok !, Is need to install s3 plugins in ansible?

                    – Nullpointer
                    Apr 4 '17 at 4:33













                  • If syntax is ok, then, what other error are you getting? S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

                    – Marko Živanović
                    Apr 4 '17 at 9:20











                  • I resolve issue with above but need to write hosts, tasks, connection and gather_facts above on script then its works

                    – Nullpointer
                    Apr 4 '17 at 9:25













                  • Yes, the above is just a single task, if you want to run it as a playbook, you'll need the parameters you described.

                    – Marko Živanović
                    Apr 4 '17 at 9:29














                  2














                  2










                  2









                  Module name (s3) should be at the same indentation level as the name:



                  - name: Copy to s3
                  s3:
                  aws_access_key: "{{ lookup('env','aws_key') }}"
                  aws_secret_key: "{{ lookup('env','aws_secret') }}"
                  bucket: "{{ aws_packages_bucket }}"
                  object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  mode: get
                  overwrite: no





                  share|improve this answer














                  Module name (s3) should be at the same indentation level as the name:



                  - name: Copy to s3
                  s3:
                  aws_access_key: "{{ lookup('env','aws_key') }}"
                  aws_secret_key: "{{ lookup('env','aws_secret') }}"
                  bucket: "{{ aws_packages_bucket }}"
                  object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
                  mode: get
                  overwrite: no






                  share|improve this answer













                  share|improve this answer




                  share|improve this answer



                  share|improve this answer










                  answered Apr 3 '17 at 14:20









                  Marko ŽivanovićMarko Živanović

                  2011 silver badge4 bronze badges




                  2011 silver badge4 bronze badges
















                  • I use same but still getting same error, showing YAML syntax ok !, Is need to install s3 plugins in ansible?

                    – Nullpointer
                    Apr 4 '17 at 4:33













                  • If syntax is ok, then, what other error are you getting? S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

                    – Marko Živanović
                    Apr 4 '17 at 9:20











                  • I resolve issue with above but need to write hosts, tasks, connection and gather_facts above on script then its works

                    – Nullpointer
                    Apr 4 '17 at 9:25













                  • Yes, the above is just a single task, if you want to run it as a playbook, you'll need the parameters you described.

                    – Marko Živanović
                    Apr 4 '17 at 9:29



















                  • I use same but still getting same error, showing YAML syntax ok !, Is need to install s3 plugins in ansible?

                    – Nullpointer
                    Apr 4 '17 at 4:33













                  • If syntax is ok, then, what other error are you getting? S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

                    – Marko Živanović
                    Apr 4 '17 at 9:20











                  • I resolve issue with above but need to write hosts, tasks, connection and gather_facts above on script then its works

                    – Nullpointer
                    Apr 4 '17 at 9:25













                  • Yes, the above is just a single task, if you want to run it as a playbook, you'll need the parameters you described.

                    – Marko Živanović
                    Apr 4 '17 at 9:29

















                  I use same but still getting same error, showing YAML syntax ok !, Is need to install s3 plugins in ansible?

                  – Nullpointer
                  Apr 4 '17 at 4:33







                  I use same but still getting same error, showing YAML syntax ok !, Is need to install s3 plugins in ansible?

                  – Nullpointer
                  Apr 4 '17 at 4:33















                  If syntax is ok, then, what other error are you getting? S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

                  – Marko Živanović
                  Apr 4 '17 at 9:20





                  If syntax is ok, then, what other error are you getting? S3 module should be available by default. Also, try updating your Ansible and boto libraries: pip install -U ansible boto.

                  – Marko Živanović
                  Apr 4 '17 at 9:20













                  I resolve issue with above but need to write hosts, tasks, connection and gather_facts above on script then its works

                  – Nullpointer
                  Apr 4 '17 at 9:25







                  I resolve issue with above but need to write hosts, tasks, connection and gather_facts above on script then its works

                  – Nullpointer
                  Apr 4 '17 at 9:25















                  Yes, the above is just a single task, if you want to run it as a playbook, you'll need the parameters you described.

                  – Marko Živanović
                  Apr 4 '17 at 9:29





                  Yes, the above is just a single task, if you want to run it as a playbook, you'll need the parameters you described.

                  – Marko Živanović
                  Apr 4 '17 at 9:29













                  0


















                  To Copy Object from Local Server to S3 using Ansible module, Use



                  mode: put


                  get will be used to download the object.



                  Reference






                  share|improve this answer

































                    0


















                    To Copy Object from Local Server to S3 using Ansible module, Use



                    mode: put


                    get will be used to download the object.



                    Reference






                    share|improve this answer































                      0














                      0










                      0









                      To Copy Object from Local Server to S3 using Ansible module, Use



                      mode: put


                      get will be used to download the object.



                      Reference






                      share|improve this answer
















                      To Copy Object from Local Server to S3 using Ansible module, Use



                      mode: put


                      get will be used to download the object.



                      Reference







                      share|improve this answer















                      share|improve this answer




                      share|improve this answer



                      share|improve this answer








                      edited Aug 11 '17 at 13:00









                      Stephen Rauch

                      3,41610 gold badges16 silver badges30 bronze badges




                      3,41610 gold badges16 silver badges30 bronze badges










                      answered Aug 11 '17 at 12:40









                      Naveen KumarNaveen Kumar

                      11 bronze badge




                      11 bronze badge


























                          0


















                          I had a similar issue when using aws_s3, the replacement module for s3.



                          Check to see if you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.



                          I had both boto & boto3 installed but, due to playing with virtual environments, they were only installed for Python3.5 and no other versions of python. Thus python (Python2.7 on my setup) that Ansible uses could not import the boto's and failed with this very esoteric error message.



                          To ensure that all is correctly installed is run python on the command line and try to import the boto's manually:



                          13:20 $ python
                          Python 2.7.12 (default, Nov 19 2016, 06:48:10)
                          [GCC 5.4.0 20160609] on linux2
                          Type "help", "copyright", "credits" or "license" for more information.
                          >>> import boto
                          >>> import boto3
                          >>>
                          13:21 $ python3
                          Python 3.5.2 (default, Sep 14 2017, 22:51:06)
                          [GCC 5.4.0 20160609] on linux
                          Type "help", "copyright", "credits" or "license" for more information.
                          >>> import boto
                          >>> import boto3
                          >>>


                          If you receive an error in python then you get the error in Ansible.






                          share|improve this answer































                            0


















                            I had a similar issue when using aws_s3, the replacement module for s3.



                            Check to see if you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.



                            I had both boto & boto3 installed but, due to playing with virtual environments, they were only installed for Python3.5 and no other versions of python. Thus python (Python2.7 on my setup) that Ansible uses could not import the boto's and failed with this very esoteric error message.



                            To ensure that all is correctly installed is run python on the command line and try to import the boto's manually:



                            13:20 $ python
                            Python 2.7.12 (default, Nov 19 2016, 06:48:10)
                            [GCC 5.4.0 20160609] on linux2
                            Type "help", "copyright", "credits" or "license" for more information.
                            >>> import boto
                            >>> import boto3
                            >>>
                            13:21 $ python3
                            Python 3.5.2 (default, Sep 14 2017, 22:51:06)
                            [GCC 5.4.0 20160609] on linux
                            Type "help", "copyright", "credits" or "license" for more information.
                            >>> import boto
                            >>> import boto3
                            >>>


                            If you receive an error in python then you get the error in Ansible.






                            share|improve this answer





























                              0














                              0










                              0









                              I had a similar issue when using aws_s3, the replacement module for s3.



                              Check to see if you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.



                              I had both boto & boto3 installed but, due to playing with virtual environments, they were only installed for Python3.5 and no other versions of python. Thus python (Python2.7 on my setup) that Ansible uses could not import the boto's and failed with this very esoteric error message.



                              To ensure that all is correctly installed is run python on the command line and try to import the boto's manually:



                              13:20 $ python
                              Python 2.7.12 (default, Nov 19 2016, 06:48:10)
                              [GCC 5.4.0 20160609] on linux2
                              Type "help", "copyright", "credits" or "license" for more information.
                              >>> import boto
                              >>> import boto3
                              >>>
                              13:21 $ python3
                              Python 3.5.2 (default, Sep 14 2017, 22:51:06)
                              [GCC 5.4.0 20160609] on linux
                              Type "help", "copyright", "credits" or "license" for more information.
                              >>> import boto
                              >>> import boto3
                              >>>


                              If you receive an error in python then you get the error in Ansible.






                              share|improve this answer














                              I had a similar issue when using aws_s3, the replacement module for s3.



                              Check to see if you have boto (for s3 and aws_s3) and boto3 (for aws_s3) correctly installed.



                              I had both boto & boto3 installed but, due to playing with virtual environments, they were only installed for Python3.5 and no other versions of python. Thus python (Python2.7 on my setup) that Ansible uses could not import the boto's and failed with this very esoteric error message.



                              To ensure that all is correctly installed is run python on the command line and try to import the boto's manually:



                              13:20 $ python
                              Python 2.7.12 (default, Nov 19 2016, 06:48:10)
                              [GCC 5.4.0 20160609] on linux2
                              Type "help", "copyright", "credits" or "license" for more information.
                              >>> import boto
                              >>> import boto3
                              >>>
                              13:21 $ python3
                              Python 3.5.2 (default, Sep 14 2017, 22:51:06)
                              [GCC 5.4.0 20160609] on linux
                              Type "help", "copyright", "credits" or "license" for more information.
                              >>> import boto
                              >>> import boto3
                              >>>


                              If the import fails in Python, it will fail in Ansible too.
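                              The interpreter check above can also be scripted; a minimal sketch (only the module names boto and boto3 come from the answer; the helper function is illustrative):

```python
import importlib.util
import sys

def has_module(name):
    """Return True if this interpreter can import the named module."""
    return importlib.util.find_spec(name) is not None

# Report which interpreter is being checked and whether the boto libraries
# are visible to it; Ansible uses whichever interpreter
# ansible_python_interpreter points at, so python2 and python3 can differ.
print(sys.executable)
for mod in ("boto", "boto3"):
    print(mod, "available" if has_module(mod) else "MISSING")
```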







                              answered Oct 28 '17 at 12:47









                              Norm1710
                                  Add tasks: above - name, like this (the playbook was missing the tasks: key, and the module parameters must be indented under s3:):



                                  ---
                                  - hosts: localhost
                                    tasks:
                                      - name: Copy to s3
                                        s3:
                                          aws_access_key: "{{ lookup('env','aws_key') }}"
                                          aws_secret_key: "{{ lookup('env','aws_secret') }}"
                                          bucket: "{{ aws_packages_bucket }}"
                                          object: "/JI79IML/my_part_X86_64_c7.15.tar.gz"
                                          dest: "/data/parts/JI79IML/my_part_X86_64_c7.15.tar.gz"
                                          mode: put
                                          overwrite: no




                                      edited Mar 15 at 10:02









                                      Stephen Kitt

                                      answered Mar 15 at 9:42









                                      Anubhav
                                          This may help you:



                                          ---
                                          - name: Bucket copy
                                            hosts: localhost
                                            become_method: sudo
                                            become_user: root
                                            become: true
                                            gather_facts: False
                                            vars_files:
                                              - varlist.yml
                                            tasks:
                                              - name: Get s3 objects        # Make a list of the bucket's directories and files in a register
                                                aws_s3:
                                                  bucket: "{{ Bucket_name }}"
                                                  mode: list
                                                  aws_access_key: "{{ aws_access_key }}"
                                                  aws_secret_key: "{{ aws_secret_key }}"
                                                register: s3_object_list

                                              - name: Create download directory        # Create directories for the latest code from the s3 bucket
                                                file:
                                                  path: "S3/{{ item }}"
                                                  state: directory
                                                with_items:
                                                  - "{{ s3_object_list.s3_keys }}"
                                                ignore_errors: true

                                              - name: Download s3 objects        # Download files into their appropriate directories on the server side
                                                aws_s3:
                                                  bucket: "{{ Bucket_name }}"
                                                  object: "{{ item }}"
                                                  mode: get
                                                  dest: "S3/{{ item }}"
                                                  aws_access_key: "{{ aws_access_key }}"
                                                  aws_secret_key: "{{ aws_secret_key }}"
                                                with_items:
                                                  - "{{ s3_object_list.s3_keys }}"
                                                ignore_errors: true

                                              - name: Folder permissions
                                                file:
                                                  path: S3/*
                                                  state: touch
                                                  mode: "u=rw,g=r,o=r"




                                              answered 15 mins ago









                                              Shaha Nawaj Mulla

                                                  Thanks for contributing an answer to Unix & Linux Stack Exchange!

