I have a simple Lubuntu server where I set up an SSH server. While that is sufficient to access a drive/folder from another Linux machine using sftp, it is a pain from Windows and Mac, so unfortunately Samba is still needed here.
For simplicity I needed only one Samba user, which I could map to a single user on my Lubuntu machine.
Following various tutorials, I ended up doing this with a few simple commands.
Install Samba:
$ sudo apt-get install samba
Add myuser to Samba:
$ sudo smbpasswd -a myuser
Export the drive I needed (no guest user):
$ sudo vi /etc/samba/smb.conf
and add the following export (no special options needed here):
[files]
path = /media/music/files
writeable = yes
Save and restart Samba:
$ sudo service smbd restart
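For reference, a slightly fuller version of the same share definition; the commented lines show common optional settings (my assumption of useful extras, not required for the minimal setup above):

```
[files]
path = /media/music/files
writeable = yes
; valid users = myuser
; guest ok = no
```

`valid users` would restrict the share to the single Samba user created above, and `guest ok = no` (the default) keeps anonymous access off.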
As I was facing this again while creating AMP pages for my main site www.tekartik.com, here is a concrete example of my changes.
First, add the proper DNS record at my domain provider (OVH in my case):
www.tekartik.com. CNAME c.storage.googleapis.com.
This was first giving the error:
<Details>Anonymous callers do not have storage.objects.list access to bucket www.tekartik.com.</Details>
The solution was to give the bucket a website configuration:
gsutil web set -m index.html -e 404.html gs://www.tekartik.com
To make all newly added files public by default and stop worrying about ACLs, I can set the default object ACL:
gsutil defacl ch -u AllUsers:R gs://www.tekartik.com
To make previously uploaded files public, I can use:
gsutil -m acl set -r public-read gs://www.tekartik.com
I have several static websites hosted on Google Cloud Storage, for which the only missing piece is support for a custom domain over SSL.
In my deployment process from the local file system, I was using the gsutil cp command, which can gzip files by extension (html, css, js). However, every file was copied on each run, which was a pain when the site contained several unmodified images.
My first experience with gsutil rsync was bad:
- I could not gzip the files I needed
- just touching a file was causing it to be re-deployed
However I was able to find an optimized way to deploy:
- split my local folder into 2 folders with the same hierarchy, one containing the content to be gzipped (html, css, js...), the other the remaining files
- gzip each file in the gzip folder (in place)
- call gsutil rsync for each folder to the same gs destination
Of course, this is only a one-way synchronization, and deleted local files are not deleted remotely.
For the gzip folder the command is gsutil -m -h Content-Encoding:gzip…
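The split-and-gzip flow above can be sketched as a small script. The folder names, sample files, and bucket name here are hypothetical placeholders; the gsutil calls are shown commented out since they need a real bucket and credentials:

```shell
#!/bin/sh
set -e

# Hypothetical layout: build a tiny sample site so the sketch is runnable.
SRC=site; GZ=deploy/gzip; RAW=deploy/raw
rm -rf "$SRC" deploy
mkdir -p "$SRC/css"
echo '<html>hello</html>' > "$SRC/index.html"
echo 'body{margin:0}'     > "$SRC/css/main.css"
printf 'binary'           > "$SRC/logo.png"    # stand-in for an image

# 1. Split the site into two trees with the same hierarchy.
(cd "$SRC" && find . -type f) | while IFS= read -r f; do
  case "$f" in
    *.html|*.css|*.js) dest="$GZ/$f" ;;   # content to be gzipped
    *)                 dest="$RAW/$f" ;;  # everything else
  esac
  mkdir -p "$(dirname "$dest")"
  cp "$SRC/$f" "$dest"
done

# 2. Gzip the compressible tree in place, keeping the original names.
find "$GZ" -type f -exec sh -c 'gzip -9 "$1" && mv "$1.gz" "$1"' _ {} \;

# 3. rsync each tree to the same destination (bucket is a placeholder):
#    gsutil -m -h Content-Encoding:gzip rsync -r "$GZ"  gs://my-bucket
#    gsutil -m rsync -r "$RAW" gs://my-bucket
```

Since both rsync calls target the same bucket, the remote hierarchy ends up identical to the original local folder, with only the html/css/js objects carrying the gzip Content-Encoding.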