I have just started playing with Google Cloud. I used to work on normal servers, so I need advice.
I created my first instance and deployed WordPress, and I installed the WooCommerce plugin. The shop is quite fast and I am happy (on the lowest settings), but now:
I wanted to edit function.php but I can't. The file attributes are read-only, so how can I change it?
How do I get access to all my files? I can't see them in Cloud Storage. How do I set up FTP?
What about the database for my shop? I understand I can create a new database, but where do I access the current database of my WordPress?
What else should I deploy to work comfortably with my WordPress?
About SSL:
SNI SSL certificate slots are offered for no additional charge for
accounts that have billing activated. Free accounts are limited to 5
certificates.
I have no experience with SSL, but I plan to run a shop, so what does this mean? Free certificates for 5 instances or 5 deployments? How many certificates do I need to run one shop?
I know there are many questions, but I wanted to go further, and all the advice on the internet is outdated because it is for older versions of Google Cloud. Please help me understand all of this.
I assume you're attempting to use WordPress on Google App Engine.
GAE has no real writable filesystem, so you cannot write to it (unless you juggle with the APIs GAE offers). Editing happens locally using the GAE SDK development server, and you deploy your changes to the App Engine ecosystem using the SDK interface (GUI or CLI). All application writes should go to Google Cloud Storage (which is similar to Amazon S3 and the like).
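For illustration, here is a minimal sketch of what an application write looks like on the GAE PHP runtime, which registers a gs:// stream wrapper for Cloud Storage; the bucket name is a placeholder:

    <?php
    // Minimal sketch: on GAE the local filesystem is read-only, so
    // application writes go to a Cloud Storage bucket through the
    // gs:// stream wrapper. "my-app-bucket" is a placeholder.
    $file = 'gs://my-app-bucket/uploads/note.txt';

    // This write lands in the bucket, not on a local disk.
    file_put_contents($file, "stored in Cloud Storage, not on local disk\n");

    // Reading back works through the same wrapper.
    echo file_get_contents($file);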
I'm not certain whether Google Cloud Storage can be accessed via traditional FTP; there might be some middleware required. You can see and browse the contents of your buckets in the developer project console (https://console.developers.google.com/).
The databases are on a separate "server" when using GAE. MySQL instances are spawned into the Google Cloud SQL ecosystem and are available to App Engine and Compute Engine instances (and other places too). You can set the Cloud SQL address and port in wp-config.php as you normally would. You also need to create a local MySQL database for your local installation. More: https://cloud.google.com/appengine/docs/php/cloud-sql/
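As a sketch, the database section of wp-config.php can look roughly like this (the project/instance names and credentials are placeholders; a check along these lines distinguishes App Engine from your local development server):

    <?php
    // Sketch of the database settings in wp-config.php for Google Cloud SQL.
    // "my-project:my-sql-instance" and the credentials are placeholders.
    if (isset($_SERVER['SERVER_SOFTWARE'])
        && strpos($_SERVER['SERVER_SOFTWARE'], 'Google App Engine') !== false) {
        // On App Engine, Cloud SQL is reached through a Unix socket.
        define('DB_HOST', ':/cloudsql/my-project:my-sql-instance');
    } else {
        // The local development server talks to a local MySQL database.
        define('DB_HOST', 'localhost');
    }
    define('DB_NAME', 'wordpress');
    define('DB_USER', 'wp_user');
    define('DB_PASSWORD', 'secret');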
When working with Google App Engine you should deploy the whole WordPress installation (wp-config.php, wp-includes/, wp-admin/, wp-content/, etc.) in order for it to work in the GAE system. For a "better" deployment system you should do some searching or ask a new question dedicated to that issue.
The certificates themselves on GAE are not free, but the "slots" you put the certificates into are. Free projects (no billing enabled) offer 5 free slots into which you can put your purchased certificates. SSL SNI means that you can use multiple different domain/host certificates under a single listening IP address (which some years back was not that simple to do). What this all means is that GCP offers a way to use certificates with their services, but you still need to get the certificates themselves elsewhere.
Have you seen the GAE starter project offered by Google: https://googlecloudplatform.github.io/appengine-php-wordpress-starter-project/ ? It makes your life a bit easier when developing WP sites for Google App Engine.
If you're working with Google Compute Engine instances, then they should operate just like regular VPS machines, with some Google restrictions applied. I have not used them so I do not know the specifics.
I have 6 Google VM instances which have been shut down because of money owed. I currently can't pay, but I need access to some particular files requested by an important client.
I need your help to gain access to the directory in which my WordPress files are installed.
There is no way to access your files unless you take your issue to Google Cloud Platform Support. Contact the billing support team through a support ticket, chat, mail, or phone.
You may also report your issue on the Issue Tracker. Please make sure to file your ticket under the right component.
I have been looking through many blogs and sites for how to deploy a WordPress website multi-region on a cloud platform.
I have gone through GCP App Engine and Kubernetes but didn't find much.
How do I create a database connection from another region, and how do I manage WordPress media and sync it across the regions? I am also looking for auto-scaling of the website.
For the database we can use a cross-region read replica, but how do we handle the media data and sync it across all the instances in different regions?
To deploy highly available and scalable WordPress architectures on AWS, I would suggest reading this white paper: https://aws.amazon.com/blogs/architecture/wordpress-best-practices-on-aws/
The key to multi-region deployment is to have a copy of the data in both regions. This comes with a lot of challenges if you consider having two database masters, i.e. two places where write operations can happen. (In WordPress terms, a write happens when you author a post or when customers leave comments.)
Having a cross-region read replica has been possible with Amazon RDS since 2013: https://aws.amazon.com/blogs/aws/cross-region-read-replicas-for-amazon-rds-for-mysql/
For a master-master setup, have a look at Amazon Aurora Global Database (compatible with MySQL): https://aws.amazon.com/rds/aurora/global-database/ But I would seriously question why you want to do that first.
[UPDATE 17 July 2019]
I just found out that the Bitnami distribution of WordPress has documentation explaining how to use S3 for media files: https://docs.bitnami.com/aws/apps/wordpress-pro/configuration/wordpress-aws-s3/
I will post this answer even though there are better answers. This answer provides additional information on one particular design.
I have deployed multi-region WordPress on both AWS and Google Cloud. WordPress is simply not designed for this. Unless you have money to spend and IT talent, choose a company that offers managed distributed WordPress. You will save money and headaches.
For the last project, the company did not require instant updates/synchronization globally but did require high traffic loads that could be predicted. We decided upon separate WordPress systems in each region. We wrote software that ran once per hour to synchronize the WordPress content between systems. This involved syncing the wp-uploads directory on the file system, moving static assets to cloud storage behind a CDN, and copying content changes to each MySQL database. If there was a conflict, an email was sent to an admin for manual review. Once per day, software ran that compared the newest posts to verify content and synchronization between servers.
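For a flavour of what such an hourly job can look like, here is a rough sketch under stated assumptions (rsync over SSH for wp-uploads and mysqldump for the content tables; all hosts, paths, and table names are placeholders, and the conflict detection and admin email from the real software are omitted):

    <?php
    // Rough sketch of an hourly cross-region content sync. All hosts and
    // paths are placeholders; real conflict handling is omitted.
    $remote  = 'wp@us-east.example.com';
    $uploads = '/var/www/html/wp-content/uploads/';

    // 1. Push new media files to the other region; rsync copies only deltas.
    passthru("rsync -az --update $uploads $remote:$uploads");

    // 2. Dump the content tables and replay them on the remote database.
    //    A real job must first check which side has the newer revision.
    passthru('mysqldump --single-transaction wordpress wp_posts wp_postmeta'
           . ' > /tmp/content.sql');
    passthru("scp /tmp/content.sql $remote:/tmp/");
    passthru("ssh $remote 'mysql wordpress < /tmp/content.sql'");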
The systems in each region were load-balanced and auto-scaled. The database was hosted separately on managed MySQL servers. The WordPress directory was hosted on an NFS share. We used cloud storage + CDN for static assets (CSS, JS, images, downloads). Except for cloud storage, we did not share assets between regions; each region was independent. Each region had at least two servers running at all times. During forecasted peak loads (marketing releases, events, etc.) we would scale each group up/down based on timezone via a GUI click to prewarm the systems.
Currently I have my personal sites on Media Temple Grid and I just FTP everything up to the server manually. I currently pay $20 a month for this service, but I am willing to up the payments slightly for something more comprehensive (for one, for some reason I can't upgrade my Grid server to PHP 7).
When doing research I realized how little I know about how this whole infrastructure works. At work we use Beanstalk, which lets me see diffs and deploy from a GUI, which I like, but does it handle all of the hosting as well, or will I have to integrate it with some hosting service like DigitalOcean? (At work we have a server vendor that does all of this, so I'm in the dark about that.)
Basically what I need to host is a couple of WordPress sites and a couple of Laravel apps. I would like recommendations on hosting and environments like Beanstalk (for Laravel I have also heard of Forge). Do I need to get a hosting provider and then a separate service like Forge or Beanstalk on top of that?
Without much knowledge of setting up your own environments, DigitalOcean & Forge would be the easiest option for you to get up and running.
https://mattstauffer.co/blog/getting-your-first-site-up-and-running-in-laravel-forge
>>>> BACKGROUND ON THE ISSUE <<<<
We were using Google Apps for Business when we started with the project. This allowed us to use the Google Developer Console (https://console.developers.google.com/) with our #company.co.za accounts and also to “login with Google” using our #company.co.za accounts. It turns out that the Google Developer Project (where the API keys are) was created using an ex-colleague's #company.co.za Google account.
When we moved from Google Apps for Business to Office 365, we lost the ability to log in to the Google Developer Console with our #company.co.za accounts. By then the colleague wasn't working here anymore, and I guess it all happened so quickly that we didn't make sure to tie up all the loose ends.
Now we need to transfer development of the app and subsequently all related 3rd party projects and things, to the client for future development, but I cannot access the Google project.
This will require them to create a project on their side, generate new API keys for using the Google Maps API etc. And then update the apps (Android and iOS) with the new API keys.
>>>>> THIS IS MY QUESTION <<<<<
However, and this is where my question comes in, the apps are still working and happily accessing Google Maps. This makes me think that the project must still be somewhere.
I tried to access the Google Help pages, but because we're on a Bronze package, we can only find support information in their developer communities and online documentation listed here:
Join a Community
Service Disruption Notification
Best Practice Guides
But I thought I'd ask here too because SO is very reliable with answers :)
So, any idea if the project is still live somewhere? Or should we just create a new project with new API keys?
I don't think this is a Stack Overflow question since it's not directly programming-related. However, if you want to regain access to your project, this should be fairly simple, and I hope this helps:
Create a new Google Apps for Business account with your domain (or maybe your old Apps domain still exists?) and a single account. If you choose monthly payment, the costs will be only a few bucks.
As the Google Apps domain administrator you should be able to access all App Engine projects that belong to users of the same domain. If that doesn't work, you can contact Google support to reassign the projects. Alternatively, you may be able to recreate an account that owned the app. If you don't know the owner email, it is shown in the OAuth consent screen. With that email, try to access the project.
Create a Gmail account and transfer project ownership to this Gmail account.
Delete the Google Apps domain to avoid additional costs.
All this is doable within an hour.
In case it doesn't work, I would suggest you contact your Google sales representative or reseller and tell them that you would like to purchase Silver-level support, but only if they can restore the permissions for you. This will cost you more, but if you have to access the project it may be the only way.
Last but not least:
You can contact Google support. You don't need Silver-level support for that; it will just take a lot longer to process your request. While in contact with Google support you will have to prove that you are in fact the owner of the domain, which is usually done by adding a TXT record to your domain or uploading a file to your web server. So make sure you have access to your domain's DNS zone files/settings or your web server's document root.
Some of the problems that can happen are timeouts, disconnections, and not being able to resume a file, so having to start over from the beginning. Assuming these files are up to around 5 GB in size, what is the best solution for dealing with this problem?
I'm using a Drupal 6 install for the website.
Some of my constraints due to the server setup I have to deal with:
Shared hosting with max 200 connections at a time (unlimited disk space)
Unable to create users through an API (so I can't automatically generate FTP accounts)
I do have the ability to run cron-type scripts via a Drupal module.
My initial thought was to create FTP users based on Drupal accounts and require users to download an FTP client for their OS of choice. But the lack of an API to auto-create FTP accounts, and the inability to do it from the command line, kind of hinder that solution. If there's a workaround someone can think of, let me know!
Thanks
Usually, shared hosting does not support large file uploads through the browser. A solution may be to use another file-hosting service for your large uploads. A nice and easy solution to integrate is Amazon S3 and its browser-based upload with a form POST.
It can be integrated in a custom module that provides an upload form protected by Drupal's access control. If you require the files to be hosted on the Drupal server, you can use cron (either Drupal's or an external one) to move the files from S3 to your own hosting.
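As a sketch, the core of such a module is signing the POST policy that the browser form sends to S3 (this uses S3's legacy signature version 2, which matches the Drupal 6 era; the bucket name and AWS credentials are placeholders):

    <?php
    // Sketch: signing an S3 browser-based POST upload policy (legacy
    // signature v2). Bucket name and AWS credentials are placeholders.
    $bucket     = 'my-upload-bucket';
    $aws_key    = 'AKIA...';
    $aws_secret = 'my-secret-key';

    $policy = base64_encode(json_encode(array(
        'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600),
        'conditions' => array(
            array('bucket' => $bucket),
            array('starts-with', '$key', 'uploads/'),
            // Allow files up to 5 GB, the S3 single-request limit.
            array('content-length-range', 0, 5368709120),
        ),
    )));
    $signature = base64_encode(hash_hmac('sha1', $policy, $aws_secret, true));

    // $aws_key, $policy and $signature go into hidden fields of the form
    // that posts directly to https://my-upload-bucket.s3.amazonaws.com/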
You're kind of limited in what you can do on a shared host. Your best option is probably to install SWFUpload and hope there aren't a lot of mid-upload errors.
Better options that you probably can't use on a shared host include the upload progress PHP extension (which Drupal automatically uses when it's installed) and, as you said, associating FTP accounts with Drupal accounts.
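For reference, a minimal sketch of how the upload progress extension is polled (the hidden UPLOAD_IDENTIFIER form field and the bytes_uploaded/bytes_total keys are what the PECL uploadprogress extension documents; the script name is a placeholder):

    <?php
    // Sketch: a progress endpoint polled via AJAX while an upload runs.
    // The upload form carries a hidden UPLOAD_IDENTIFIER field whose
    // value is passed here, e.g. progress.php?id=abc123.
    $info = uploadprogress_get_info($_GET['id']);

    if ($info && $info['bytes_total'] > 0) {
        // The extension reports bytes_uploaded and bytes_total.
        printf('%d%%', $info['bytes_uploaded'] / $info['bytes_total'] * 100);
    } else {
        echo 'no upload in progress';
    }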