
Posts

Showing posts from June, 2020

Read or get the list of contents on S3 bucket

S3 (Simple Storage Service) buckets are quite handy for storing files or any other data in the AWS cloud. As with any storage, online or offline, keeping the data organized is a challenge: when a bucket holds a lot of files, finding the data you need is difficult, especially if the bucket is not organized properly. Even when the data is organized into folders with proper time stamps, without a data catalog to classify which one is which, getting the right data on time is still a challenge. S3 is ideal for backup, since backups are often accessed only when needed, so familiarity with the backup folder structure is necessary. When the files or data needed cannot be found easily, listing the contents of the S3 bucket is a good option. Once the list is ready, opening it in an editor such as Notepad and searching through it provides a way to find things quickly and easily. ...
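A minimal sketch of that listing step, assuming the AWS Tools for PowerShell module is installed and credentials are configured; the bucket name and output path are placeholders:

```powershell
# Sketch: dump every object key in a bucket to a text file for searching.
# NameofthebucketinS3 and C:\Temp\s3-contents.txt are placeholders.
$accesskey = '123'
$secretkey = '456'
$dregion   = 'us-east-1'

Get-S3Object -AccessKey $accesskey -SecretKey $secretkey -Region $dregion `
    -BucketName NameofthebucketinS3 |
    Select-Object -ExpandProperty Key |
    Out-File C:\Temp\s3-contents.txt

# Open C:\Temp\s3-contents.txt in Notepad and use Ctrl+F to find the file needed
```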

PowerShell switch case call function

The code below collects user input and uses a switch statement to check whether the input matches; when it matches, a function is called that executes commands. Since a PowerShell script runs top to bottom, the functions are defined before the switch that calls them. Here's the code:

#function called if computer_1 is the input
function func_computer1 {
    write-host "You entered Computer_1"
    #or replace with other commands like reboot / shutdown / or other commands
    #Restart-Computer -ComputerName computer_1
}

#function called if computer_2 is the input
function func_computer2 {
    write-host "You entered Computer_2"
    #or replace with other commands like reboot / shutdown / or other commands
}

#get or read from user input
$computer_name = read-host "Enter Computer Name:"

switch ($computer_name) {
    #if the input is computer_1 then function func_computer1 is called
    computer_1 { func_computer1 }
    computer_2 { func_computer2 }
} ...
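Input that matches no case simply falls through the switch silently. A default branch (not in the original snippet, shown here as an assumption) can catch unmatched input; a sketch:

```powershell
switch ($computer_name) {
    computer_1 { func_computer1 }
    computer_2 { func_computer2 }
    #default runs when no other case matches
    default    { write-host "Unknown computer: $computer_name" }
}
```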

How to upload to S3 using PowerShell

Uploading to S3 using PowerShell is quite easy. Here's the code:

$accesskey = '123'
$secretkey = '456'
$dregion   = 'us-east-1'

Write-S3Object -AccessKey $accesskey -SecretKey $secretkey -Region $dregion -BucketName NameofthebucketinS3 -File C:\Temp\test-file.txt

#Replace NameofthebucketinS3 with the bucket name found in S3
#test-file.txt is the file that will be uploaded to S3 and is located in C:\Temp
#Change the folder location C:\Temp and the file to be uploaded
#The code above uploads the file to the root of the S3 bucket

If the file needs to go to a specific subfolder in the S3 bucket, use the code below:

$accesskey = '123'
$secretkey = '456'
$dregion   = 'us-east-1'

Write-S3Object -AccessKey $accesskey -SecretKey $secretkey -Region $dregion -BucketName NameofthebucketinS3/SubFolderBucket -File C:\Temp\test-file.txt

Use a forward slash to indicate the subfolder path. So the syntax would be...
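Write-S3Object can also upload a whole folder rather than a single file. A sketch using the same placeholder bucket name; the folder path and the backup/ key prefix are assumptions to adjust:

```powershell
# Sketch: upload everything under C:\Temp (including subfolders) to the
# bucket under the backup/ prefix. Paths and bucket name are placeholders.
$accesskey = '123'
$secretkey = '456'
$dregion   = 'us-east-1'

Write-S3Object -AccessKey $accesskey -SecretKey $secretkey -Region $dregion `
    -BucketName NameofthebucketinS3 -Folder C:\Temp -KeyPrefix backup/ -Recurse
```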

Linux find accessed and modified files

Finding accessed and modified files might be necessary at times for checks or audit purposes. If files kept in a folder or directory have been accessed or modified when they should not have been, something dubious is going on. In Linux, finding accessed and modified files can be done with a one-liner command: find /home -type f \( -amin -60 -o -mmin -60 \) -print The command above finds any files accessed within the last 60 minutes with the "-amin" option, and also shows files modified within the last 60 minutes with the "-mmin" option; the "-o" operator combines the two tests as a logical OR. A shell script can be created, and further processing can be done when files are detected. The time window can be adjusted if needed. A more robust solution for catching accessed or modified files would be a file system watcher, but the command above is quite helpful for checking activity that should not be occurring. Cheers..till next time!
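As a starting point for such a shell script, a minimal sketch wrapping the command above; the directory and minute window are placeholder defaults:

```shell
#!/bin/sh
# Sketch: report files under DIR accessed or modified in the last MINUTES
# minutes. DIR and MINUTES default to /home and 60; pass others as arguments.
DIR="${1:-/home}"
MINUTES="${2:-60}"

# -amin: last access time, -mmin: last modification time, -o: logical OR
find "$DIR" -type f \( -amin "-$MINUTES" -o -mmin "-$MINUTES" \) -print
```

Further processing (logging, alerting) can be added after the find line when any output is produced.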