I'm a newbie with Eucalyptus. I need to download and modify a prepackaged image. I use the command
euca-download-bundle
to download it. It has a required parameter: bucket.
I want to download this image:
mi-E07C107C
image-store-1297468153/image.manifest.xml
eki-F6BE10FF
eri-0B3C116B
but I always get this error:
Unable to get bucket
Here are the commands I tried:
euca-download-bundle -b image-store-1297468153
euca-download-bundle -b /var/lib/eucalyptus/bukkits/image-store-1297468153
You may not be able to download the bundle if you are not the same user who uploaded it. For example, if admin uploaded that bundle but the user aleksey is the one trying to download it, that would be the error you get.
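If that is the case, here is a minimal sketch of downloading with the uploader's credentials. The eucarc path is an assumption, and flag names can differ between euca2ools versions, so check euca-download-bundle --help:

# Source the credentials of the account that uploaded the bundle
# (the path to that account's eucarc is an assumption)
source /path/to/admin/eucarc

# -b names the bucket, -m the manifest inside it, -d the local target directory
euca-download-bundle -b image-store-1297468153 -m image.manifest.xml -d /tmp/bundle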
I want to upload nested folders and files to a Google Cloud Bucket.
So far I used this command line to do so:
gsutil -m cp -R [dir] gs://[bucket]
It works, but when I go to the Firebase console, I cannot generate an access token for the uploaded files.
If I upload the same file manually, the access token is generated automatically.
I wonder if there's a way to make gsutil upload in a manner that gives the files an access token.
I appreciate your hints.
Download URLs are normally only generated by the Firebase SDKs for Cloud Storage.
Luckily, somebody figured out that if you set the right metadata on a file, it gets a download URL too. Since metadata can be set through gsutil, it should be possible to generate download URLs that way as well.
See:
Get Download URL from file uploaded with Cloud Functions for Firebase, for the metadata to set
The gsutil documentation on setting metadata
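As a concrete illustration, a minimal sketch (the bucket and file names are placeholders, and firebaseStorageDownloadTokens is the custom metadata key the Firebase clients look for):

# Generate a token and attach it as custom metadata so Firebase treats
# it as the file's download token (bucket/object names are placeholders)
TOKEN=$(uuidgen)
gsutil setmeta -h "x-goog-meta-firebaseStorageDownloadTokens:$TOKEN" gs://[bucket]/[path/to/file]

The resulting download URL should then follow the usual https://firebasestorage.googleapis.com/v0/b/[bucket]/o/[url-encoded-path]?alt=media&token=$TOKEN pattern.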
My goal is to update one file on my WordPress website on Google Compute Engine, using Filezilla to transfer it.
I am successfully logged into my files using SFTP. I'm on a Mac. I have my VM instance name from Google Compute Engine but cannot find how to create a password.
I think if I can figure out how to create a ding dang password, my next step is to type this in the terminal:
sudo chown "vm-name" /var/www/html
Any direction is much appreciated. My website has been down since yesterday because I messed with an https plugin. I'm a designer and got in way over my head. I've learned more than I bargained for so far.
To run any Linux command inside your instance you need to:
Generate a private/public key pair using PuTTYgen.
Add the content of your public key to GCP as described in the answer below:
How can i setup remote project with PhpStorm with Google Compute Engine running LEMP?
Open PuTTY.
In the Session menu, enter yourusername@instance_external_ip as the hostname, using connection type SSH.
In the Connection/SSH/Auth menu, browse for and select your private key (the one generated in step 1).
Click Open. A new terminal window will open. You will be able to navigate inside your Linux instance and run whatever sudo commands you wish (see the OpenSSH sketch below if you'd rather do this from the Mac terminal).
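Since you're on a Mac, where PuTTY doesn't apply, here is a rough OpenSSH equivalent of the steps above. The key path and username are placeholders, and the public key still has to be added to the instance's SSH keys in the GCP console:

# Generate a key pair (the file path is just a placeholder)
ssh-keygen -t rsa -f ~/.ssh/gce_key -C yourusername

# After adding the contents of ~/.ssh/gce_key.pub to the instance's
# SSH keys metadata in GCP, connect with the matching private key:
ssh -i ~/.ssh/gce_key yourusername@INSTANCE_EXTERNAL_IP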
Good luck
Using FTP, everything is OK... but...
When I use SFTP, it connects successfully, and even when I use the upload button, it uploads the file successfully...
But when I edit and save a file and it starts to upload the changes automatically, it can't upload (red message: file........... upload failed).
Check that the specific file you are trying to upload has the correct permissions; the user should be able to write to it.
The file you are trying to upload should have read and write permission for the user you are using, and also check that the owner and owner group of the file are correct.
I faced this issue, and the problem was that the user I was using was not in the group that owned the file.
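A quick way to check and fix this on the server (the path, user, and group names are placeholders for your setup):

# Show the owner, group, and mode of the file that fails to upload
ls -l /home/user/MY_SITE/path/to/file

# If the SFTP user is not the owner and not in the owning group,
# adjust ownership or grant group write access
sudo chown sftp_user:sftp_group /home/user/MY_SITE/path/to/file
sudo chmod 664 /home/user/MY_SITE/path/to/file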
Make sure you have set the correct external path.
When using SFTP, the external directory should start with /home/user/MY_SITE
(unlike FTP, where it is /MY_SITE).
Uploading images from a desktop is fine, but when I upload an image from a mobile browser I get this error:
the file 14343254.jpg could not be saved . an unknown error has occurred
the file in the Photo field was unable to be uploaded
photo field is required
I suggest you use the devel module for this. In hook_file_presave($file) you can use dpm($file) to test whether you get the file object. Try it like this:
function MYMODULE_file_presave($file) {
  // Dump the uploaded file object to the screen (requires the devel module).
  dpm($file);
}
If hook_file_presave() doesn't work, try hook_file_insert($file).
Both will run when you press the upload button next to the file field or just hit the save button on the node edit/add page, and they will output the uploaded file to the screen as an object. If you don't have the devel module, get it from http://drupal.org/project/devel or, if you have drush, run drush dl devel. Also try checking the server error log and the recent log messages in Drupal under Reports. You may have a duplicate entry for the file URI in the database. Check whether you modify the uploaded files in one of your custom modules. I hope this helps you a bit.
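For example, a quick sketch with drush (adjust the commands to your drush version):

# Download and enable the devel module
drush dl devel
drush en devel -y

# Show the most recent watchdog entries (Reports -> Recent log messages)
drush watchdog-show --count=20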
I finally solved my problem after a week of searching.
The problem was that I am using an AWS EC2 server with an 8 GB volume that was about 70% full. Images from a mobile camera are large, so the upload generated the unknown error. When I increased the volume to 50 GB the problem was solved.
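If you want to confirm that disk space is the culprit before resizing, a quick check (the files path is an assumption about a typical Drupal layout):

# Free space on the root volume
df -h /

# Size of the Drupal public files directory (path is an assumption)
du -sh /var/www/html/sites/default/files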
I'm using the Symfony2 framework in an application and it works fine. I'm having a problem on deployment, though:
I have an upload directory in my testing area. The directory is web/uploads/images. This folder has the following permission: 700. However, each time I upload a new image in there, the permission on the image is 600. This means that the image won't show on my application. Of course, when I then set the permission manually on the newly uploaded image, it shows correctly.
I tried using umask, but it didn't really work. My guess is that maybe I need to do something in Symfony2 when I upload the image.
Thanks for your answers.
You need to set the umask as the user that you're running the Symfony application as.
look here:
http://www.cyberciti.biz/tips/understanding-linux-unix-umask-value-usage.html
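A minimal sketch of what that can look like, assuming Apache with mod_php on Debian/Ubuntu (www-data and the envvars path are assumptions; PHP-FPM setups would put this in the pool config instead):

# A umask of 0022 gives newly created files mode 644 (world-readable).
# On Debian/Ubuntu with Apache you can add this line to /etc/apache2/envvars:
umask 022
# ...and then restart Apache so PHP inherits it:
sudo service apache2 restart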
Use the Symfony Filesystem component (link). After moving the image to the directory, use chmod from that component to set the needed permissions on the moved image.