IBM i - Having fun with IBM Cloud Object Storage (COS)
Diego E. KESSELMAN
Helping Businesses Move IBM Power Workloads to the Cloud | IBM i/AIX/Linux Expert | IBM Champion 2022-2025 | "IBM i en Español" & "IBM_PowerVS_en_Español" (Telegram) Admin
In previous articles I talked about IBM Cloud Object Storage and how to connect to it from IBM i.
In this short article I'll put together a small cheat sheet and give you some tips to ease your experience.
Installing AWS CLI on IBM i
pip3 install awscli
aws configure
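When asked for credentials, aws configure expects HMAC-style keys. In the IBM Cloud portal, create a service credential for your COS instance with the "Include HMAC Credential" option enabled, then paste the access_key_id and secret_access_key values. The session below is only a sketch with placeholder values:
AWS Access Key ID [None]: <your COS HMAC access_key_id>
AWS Secret Access Key [None]: <your COS HMAC secret_access_key>
Default region name [None]: us-south
Default output format [None]: json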
Commands
The aws s3 subcommands you will use most often are listed below. For the full list of commands and options, type: aws s3 help
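These are part of the standard AWS CLI command set, nothing specific to IBM i:
aws s3 ls      #List your buckets, or the objects inside a bucket
aws s3 cp      #Copy files/objects to or from a bucket
aws s3 mv      #Move (copy and delete) files/objects
aws s3 rm      #Delete objects from a bucket
aws s3 sync    #Synchronize a local directory with a bucket
aws s3 mb / rb #Make or remove a bucket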
How to use IBM COS on IBM i
Get the endpoint URL for your buckets from the IBM Cloud portal and pass it with the "--endpoint-url" option:
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 ls
#This command lists your buckets.
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp file.txt s3://ibmitxts/file.txt
This copies the local file to the bucket "ibmitxts".
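A couple more basic operations, using the same bucket "ibmitxts" from the example above (the local paths are just placeholders):
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 ls s3://ibmitxts
#This lists the objects inside the bucket.
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp s3://ibmitxts/file.txt /home/USER/file.txt
#This downloads the object back to your IFS home directory.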
TIPS
1. Create an alias to shorten your commands:
alias ibmcos="aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3"
Now your commands will look similar to this:
ibmcos ls
ibmcos cp file.txt s3://ibmitxts/file.txt
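The alias only lives in your current PASE session. If you want it available every time you log in, add it to your shell profile — assuming your SSH session uses bash and reads $HOME/.profile:
echo 'alias ibmcos="aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3"' >> $HOME/.profile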
2. You can use piping (a way of chaining commands in Unix-like environments) to make some tasks more efficient.
Using piping to upload files
system "WRKACTJOB" | aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp - s3://ibmitxts/wrkactjob_20220523.txt
This example lists the content of WRKACTJOB and sends the result to the bucket.
A more complex example:
cat /QSYS.LIB/BACKUPSAV.LIB/MYSAVF.FILE | gzip -9 | aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp - s3://ibmitxts/MYSAVF.gz
This example reads the content of the MYSAVF save file, compresses it with gzip, and stores it in your bucket.
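Here is a sketch of the reverse path — restoring that object to a save file. The library, save file and temporary path are just examples; the target save file must already exist (CRTSAVF) and be empty, so test this with a non-critical save file first:
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp s3://ibmitxts/MYSAVF.gz - | gunzip > /tmp/MYSAVF.savf
cp /tmp/MYSAVF.savf /QSYS.LIB/BACKUPSAV.LIB/MYSAVF.FILE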
3. You can use piping to read the content of a file in your bucket
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp s3://ibmitxts/wrkactjob_20220523.txt -
This will show the content of the file. You can process the text stream with grep, cut, sed, or awk, or save it to a file using ">".
You can also extract information while the download is in progress, even from a compressed file (see the gunzip example after the warning below).
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp s3://ibmitxts/wrkactjob_20220523.txt - | grep RUN > waj_20220523_RUN.txt
This will filter the content from the cloud and save only matching records to your file.
WARNING: This reads the whole object from the cloud, so with large files it will take some time and you will consume cloud credits.
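And for the compressed case mentioned above, you can decompress on the fly while downloading. The object name and search pattern here are hypothetical:
aws --endpoint-url=https://s3.us-south.cloud-object-storage.appdomain.cloud s3 cp s3://ibmitxts/joblog_20220523.gz - | gunzip | grep MSGW > joblog_msgw.txt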
4. Improving performance
You can make some adjustments to improve performance when copying files from/to your bucket.
Edit the file "config" in your /home/USER/.aws/ directory and add this to the [default] section:
s3 =
  max_concurrent_requests = 20
  max_queue_size = 10000
  multipart_threshold = 256MB
  multipart_chunksize = 128MB
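If you prefer not to edit the file by hand, the same values can be set from the command line with aws configure set, for example:
aws configure set default.s3.max_concurrent_requests 20
aws configure set default.s3.multipart_chunksize 128MB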
Now try your uploads or downloads again.
As always, if any doubt arises, just send me a message.
Good luck!