Understanding the datastore in BitBake

I am new to BitBake. The question might sound simple, but a clarification would be a great help to me.
I am learning from the link below:
https://www.yoctoproject.org/docs/1.6/bitbake-user-manual/bitbake-user-manual.html
I have a doubt about a diagram there.
It says that I have to load the necessary values from the external environment into the datastore, and from there export them to the task environment.
As I understand it, from my PC I have to load the necessary data into a structure called the datastore, which is then loaded into the BitBake environment for BitBake to process.
But I am not able to get a clear overview.
Can you explain (if possible with an example) what external environment variables, internal environment variables, and the datastore are?
Thank you
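For illustration, a minimal sketch of the flow the manual describes, assuming BitBake 1.6-era configuration variables (MYVAR is a hypothetical name). An external environment variable only enters the datastore if it is whitelisted, and a datastore variable only reaches a task's environment if it is exported:

# in your shell (the external environment), before running bitbake
export MYVAR="from-the-shell"
export BB_ENV_EXTRAWHITE="$BB_ENV_EXTRAWHITE MYVAR"   # whitelist it into the datastore

# in a recipe (.bb) -- MYVAR now lives in the datastore
export MYVAR                                          # export it into each task's environment

python do_show () {
    # read the value back from the datastore inside a task
    bb.plain("MYVAR = %s" % d.getVar("MYVAR", True))
}
addtask show

So the "datastore" is BitBake's internal key/value store, "external environment variables" are whatever your shell passes in, and a task only sees the variables that were explicitly exported to it.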

Related

How to handle environment variables on a deployed SF4 application

Symfony has provided the Dotenv component since Symfony 3, which allows us to handle environment variables as application parameters. This looks really nice, and it's the best practice to follow according to the twelve-factor app manifesto.
With Symfony 4 they went further by pushing this practice forward, which is why I started using environment variables via the .env file.
Then I wanted to deploy, and I realized that the .env file must not be persisted on the server, as that would be the same as having a parameters.yml file.
So I've been digging into the documentation a bit, and I found this article, which explains that we can create environment variables directly via some webserver directives. That's great for code executed via FPM, but it does not tell us how to handle environment variables when running a command via the CLI, for instance.
How can I achieve this?
Should there be an equivalent of a .env file stored somewhere? But then wouldn't the parameters be duplicated?
I'm welcoming any help ;)
Finally had the time to check the link Neodan posted and everything is in there!
So for those of you wondering what to do, simply edit the /etc/environment file and add your variables. Then reboot your server and all your processes will have access to these variables.
I guess that's the simplest solution. The only drawback of this method is that these variables are available to any process or user, but that's OK as far as I'm concerned.
If you want a more secure solution, I suppose you could, as I stated before, configure your webserver to add environment variables, and export them via your .bash_profile or .bashrc file, but be careful about how you start your shell (when deploying your application, for instance). That approach is more complicated to maintain and prone to errors, I'd say.
N.B.: You also might want to be careful about how you name your variables to prevent collisions.
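For reference, a minimal sketch of what that /etc/environment file might contain (the values are placeholders; APP_ENV, APP_SECRET, and DATABASE_URL follow common Symfony naming conventions):

# /etc/environment -- simple KEY=value lines, no shell expansion;
# loaded for processes after the reboot mentioned above
APP_ENV=prod
APP_SECRET=change-me
DATABASE_URL="mysql://db_user:db_password@127.0.0.1:3306/db_name"

Prefixing your own variables (e.g. with APP_) is an easy way to avoid the naming collisions mentioned above.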

How does one turn on diagnostic and/or progress logging with firebase-tools run as a node module?

I have not been able to find any reference for the options passed into the Node module version of firebase-tools. How does one turn on diagnostic logging or progress output? The GitHub README for firebase-tools only says:
The Firebase CLI can also be used programmatically as a standard Node module. Each command is exposed as a function that takes an options object and returns a Promise.
and has only the example:
client.deploy({
  project: 'myfirebase',
  token: process.env.FIREBASE_TOKEN,
  cwd: '/path/to/project/folder'
}).then(function() { /* ... */ });
It would be really nice to get complete docs. The source code wasn't much help.
There isn't a good way to see progress via the programmatic API for the Firebase CLI right now. Your best bet would be to instead use spawn or similar to run it as a process and simply capture the stdout.
We'd like to improve this in the future but there are no concrete plans of what it will look like yet.
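For example, a minimal sketch of that workaround using Node's built-in child_process module (the project id and path are placeholders taken from the question's example):

// run the CLI as a child process and capture its output as it happens
const { spawn } = require('child_process');

const deploy = spawn('firebase', ['deploy', '--project', 'myfirebase'], {
  cwd: '/path/to/project/folder',
  env: process.env,  // FIREBASE_TOKEN etc. pass through to the CLI
});

deploy.stdout.on('data', (chunk) => process.stdout.write(chunk));  // progress output
deploy.stderr.on('data', (chunk) => process.stderr.write(chunk));
deploy.on('close', (code) => console.log(`firebase exited with code ${code}`));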
To see the complete list of keys on the client object, see commands/index.js.
In terms of what options to pass in, that is definitely hard to figure out. This seems like a great chance to submit an issue requesting specific improvements to documentation, or take a shot at documenting it yourself and submit a PR.

Writing an appspec.yml File for Deployment from S3 (and/or Bitbucket) to AWS CodeDeploy

I'd like to make it so that a commit to our Bitbucket repo (or S3 bucket) automatically deploys code (using CodeDeploy) to our EC2 instances. I'm not clear on what to use for the 'source' and 'destination' entries under the 'files' section in the appspec.yml file, and I'm also not clear on what to put in BeforeInstall and AfterInstall under the 'hooks' section. I've found some examples on Google and in the AWS documentation, but I'm confused about what to put in the fields above. The more I explore, the more confused I get.
Consider that I am new to AWS CodeDeploy.
It would also be very helpful if someone could point me to a step-by-step guide on how to configure CodeDeploy and how to automate it.
I was wondering if someone could help me out?
Thanks in advance for your help!
Thanks for using CodeDeploy. For new users, I'd recommend the following:
Try running the First Run Wizard on the console; it will show you the general flow of a deployment. It also provides a default deployment bundle, with an appspec file included.
Once you want to try a deployment yourself, the Get Started doc is a great place for help with prerequisite settings like the IAM role.
Then try some of the tutorials with a sample app too, which will give you an idea of deployment groups, deployment configurations, revisions, and so on.
The next step should be to create a bundle for your own use case; the AppSpec file doc is a great reference for this. As for your concerns about BeforeInstall and AfterInstall: if your application doesn't need to do anything at those points, the lifecycle events can be left empty. BeforeInstall can be used for pre-install tasks, such as decrypting files or creating a backup of the current version, while AfterInstall can be used for tasks such as configuring your application or changing file permissions. (A minimal example appspec.yml follows below.)
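To make that concrete, a minimal appspec.yml sketch — the destination path and script names are placeholders, and the hooks section can be omitted entirely if you have nothing to run:

version: 0.0
os: linux
files:
  - source: /                       # everything in the revision bundle...
    destination: /var/www/myapp     # ...is copied here on the instance
hooks:
  BeforeInstall:
    - location: scripts/backup_current_version.sh   # e.g. back up the running version
      timeout: 300
      runas: root
  AfterInstall:
    - location: scripts/fix_permissions.sh          # e.g. chmod/chown the new files
      timeout: 300
      runas: root

'source' is a path relative to the root of your bundle, and 'destination' is an absolute path on the instance.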
Now comes the fun part! This blog post goes into detail about how to integrate with GitHub (it's similar for Bitbucket). It's a little long but really useful, and it also covers how to deploy automatically whenever a new commit is pushed. Currently Jenkins and CodePipeline are popular for auto-triggered deployments, but there are plenty of other ways to achieve the same goal, such as Lambda, and so on.

Alfresco fails to find a new folder

I'm using Alfresco through CMIS.
In one of our environments, we have an issue.
We want to create a folder and put some docs in it.
This works fine in all our environments except one.
In that one, we can create the folder, but when we then search for the folder, it isn't found.
Later on, I can find it through the Share GUI.
There is no error message in the Share app.
Does anyone have an idea what the issue could be?
Promoting a comment to an answer...
When using Alfresco with SOLR, you need to be aware that the SOLR index isn't quite real-time. Close to real time, sure, but it's asynchronous so there's always a lag. (It's an eventually consistent index, not a fully realtime one)
There's a lot of information on the Alfresco and SOLR Wiki, including the way you can query what the current lag is.
If the lag is very low (e.g. on a lightly loaded system), you may find that SOLR catches up almost instantly, and newly created items show up in the search results right away. However, it's more normal to expect to wait a little, especially on more heavily loaded systems.
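One common way for client code to cope with the lag is to poll until the index catches up. A rough sketch using Apache Chemistry OpenCMIS — the session is assumed to be already authenticated, and the folder name is hypothetical:

import org.apache.chemistry.opencmis.client.api.ItemIterable;
import org.apache.chemistry.opencmis.client.api.QueryResult;
import org.apache.chemistry.opencmis.client.api.Session;

public class WaitForIndex {
    // returns true once the folder shows up in search results
    static boolean folderIsSearchable(Session session, String name)
            throws InterruptedException {
        for (int attempt = 0; attempt < 10; attempt++) {
            ItemIterable<QueryResult> results = session.query(
                    "SELECT * FROM cmis:folder WHERE cmis:name = '" + name + "'",
                    false);                 // false = don't search all versions
            if (results.iterator().hasNext()) {
                return true;                // SOLR has caught up
            }
            Thread.sleep(2000);             // wait for the asynchronous index
        }
        return false;
    }
}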
If no new results are showing up even after several minutes, you'll want to follow the instructions on the wiki or in the SOLR Monitoring and Troubleshooting docs to work out why, and fix it.

Using the simpletest automator in Drupal 6

I've been trying to learn how to use simpletest, and I found the simpletest automator. I was able to install it and run it, but where is the file with the results of the 'macro' saved? I haven't been able to find it.
Also, is there a quick way to duplicate a drupal install in simpletest? I know it starts from a clean install, but I don't want to have to go through and figure out what all is enabled and who has what permissions at the start of the test. Is there a script that can figure out the settings of the current drupal install?
Thank you.
Is there a script that can figure out the settings of the current drupal install?
The short answer is no.
Essentially, simpletest should be used as a unit test framework, where all of the data that is needed is set up at the beginning of the test, and the test is not reliant on a system setting or on a particular user having a permission. It does this quite well, and can test core functionality and individual modules easily. If you are testing an individual module you have written, using simpletest is, well, simple.
Unfortunately most websites use a number of modules and are configured to work together in a very specific way. Simpletest doesn't cope with this very well.
There are ways to get around this:
One option is to write a setup script in PHP which acts as one big setup script for your test, creating users and setting settings and permissions (see the sketch after this list). This can be difficult to write and maintain, and can cause the tests to take a long time to run.
Another option is for the site testing (which is different from unit testing) to be done in a tool other than simpletest. I have had some success with Selenium. The downside is that you need to find a way to have clean data, which can be tricky; copying a database works but doesn't scale.
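As an illustration of the first option, a rough sketch of a Drupal 6 test case whose setUp() recreates the configuration the test depends on. The module names, variable, and permission here are hypothetical, and details of getInfo() vary between simpletest versions:

<?php
class MyModuleTestCase extends DrupalWebTestCase {
  function getInfo() {
    return array(
      'name' => 'My module tests',
      'description' => 'Tests that script their own setup.',
      'group' => 'My module',
    );
  }

  function setUp() {
    parent::setUp('mymodule', 'views');      // enable the modules the test needs
    variable_set('site_frontpage', 'node');  // recreate site settings
    // create and log in a user with exactly the permissions under test
    $account = $this->drupalCreateUser(array('access content'));
    $this->drupalLogin($account);
  }

  function testFrontPage() {
    $this->drupalGet('');                    // the logged-in user can see the front page
    $this->assertResponse(200);
  }
}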
I've been pointed to this blog post as an answer to the question: http://www.trellon.com/content/blog/forcing-simpletest-use-live-database
You can also use a site deployment module and enable only that at the very beginning of your test (in your setUp() function).
