I have the following environment: one Apache2 web server on an Ubuntu machine with three vhosts (one vhost per project). All three projects run on Symfony2 (but different versions, from 2.2 to 2.4). Each project (and its paths) has its own user. I'm deploying the projects to this server with capifony; each project has its own recipe.
Sass version
Sass 3.2.14 (Media Mark)
Imagine I'm deploying application 1 to the web server. When the deployment process comes to dumping all production assets, it writes the temporarily generated Sass files into the following folder:
/tmp/600d657f6ac2358f30ba6bc0ab4cd7ffb6194ced
as user1.
If I now deploy application 2 to the web server, dumping the assets tries to write into exactly the same folder, this time as user2, and the following error occurs:
An error occurred while running:
* [err :: 10.0.106.103] '/usr/bin/ruby' '/usr/local/bin/sass' '--load-path' '/srv/vhosts/myproject.com/releases/20140619124055/app/../web/sass' '--scss' '--cache-location' '/tmp' '/tmp/assetic_sassbsrcle'
* [err :: 10.0.106.103]
* [err :: 10.0.106.103] Error Output:
Errno::ENOENT: No such file or directory - /tmp/600d657f6ac2358f30ba6bc0ab4cd7ffb6194ced/assetic_sassbsrclec20140619-27927-aw8xrk.lock
My current workaround is to remove this /tmp/600d657f6ac2358f30ba6bc0ab4cd7ffb6194ced folder before every deployment.
I couldn't find any path configuration in the capifony recipes or in the Symfony2 config files of any of the projects.
Any help is appreciated.
Best,
Ramo
This has been raised as an issue on the official Assetic repository as well. Since you can control what sys_get_temp_dir() returns via the TMPDIR environment variable (among others), I would recommend setting it for your dump. You could base it on the current Unix time, the commit you're deploying, or a combination of application, time, and intent; really, anything unique per deployment would work. The line responsible for setting the cache location is here, in case you want to fork Assetic and change it, which is quite doable as well. I would suggest the TMPDIR route first to confirm a potential fix.
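For example (a rough sketch, assuming you dump assets via app/console and that a per-application directory under /tmp is acceptable; the directory name below is only an illustration), you could export TMPDIR right before the dump step of each deployment:

export TMPDIR=/tmp/assetic-app1        # one directory per application (or per deploy)
mkdir -p "$TMPDIR"                     # make sure it exists before Assetic/Sass use it
php app/console assetic:dump --env=prod --no-debug

With a distinct TMPDIR per application, the two deploy users no longer fight over the same /tmp/<hash> cache folder.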
Small edit: there is also this pull request that partially addresses the issue.
Related
I am curious if you can control the output "src" folder in AWS CodeBuild.
Specifically, I see this when debugging the build in CodeBuild.
/codebuild/output/src473482839/src/github.....
I would love to be able to set/change/remove the src473482839 part of that path, because I have a feeling it is causing sbt to recompile my Scala source files. Although I am using CodeBuild's new local cache to cache my target folders between builds, the compiled classes' canonical paths change between builds, which is what I suspect is causing the problem.
After some more debugging I have managed to get my 6-minute builds down to 1:30.
Although you are not able to set or override CODEBUILD_SRC_DIR, I have found a workaround in my buildspec.
This is what my buildspec looks like now, with local caching enabled in CodeBuild.
version: 0.2
phases:
  pre_build:
    commands:
      - mkdir -p /my/build/folder/
      - cp -a ${CODEBUILD_SRC_DIR}/. /my/build/folder
  build:
    commands:
      - cd /my/build/folder
      - sbt compile test
cache:
  paths:
    - '/root/.ivy2/cache/**/*'
    - '/root/.cache/**/*'
    - 'target/**/*'
    - 'any other target folders you may need'
The key change I had to make was to copy the source (and cached target directories) over in the pre_build phase, then change directory and compile from the new, static directory.
I hope this helps someone else down the road until CodeBuild allows setting or overriding the CODEBUILD_SRC_DIR folder.
We have a monorepo that uses Yarn’s ‘workspaces’ feature, meaning that whenever possible, Yarn will hoist dependencies to the monorepo's root node_modules directory rather than keep them in the individual package's node_modules dir. This relies on Node’s module resolving algorithm, which continues to search for modules in node_modules directories up the dir tree until it finds the required module.
When using Flow types in a file that imports another package (internal or external to the monorepo), running Flow inside the package that contains that file causes a Cannot resolve <package-name> error to be thrown. It seems like Flow uses a different module resolving algorithm, and fails since the installed modules are hoisted to the root dir and Flow does not continue to search up the dir tree.
Is there a way around this other than running Flow from the root? Running from the root is less than optimal because it does not allow different settings for different packages in the monorepo.
Node version: 10.8.0
flow-bin version: 0.78.0
I also ran into this problem.
To fix it, you need to update .flowconfig:
[include]
../../node_modules/
FS struct:
/project_root
--/node_modules
--/packages
----/module1
------.flowconfig
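A quick way to verify the fix (a sketch, assuming flow-bin is hoisted to the root node_modules as described in the question and using the layout above):

cd packages/module1
../../node_modules/.bin/flow check    # should now resolve the hoisted packages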
Hand-pick the packages that should not be hoisted with a directive like:
"nohoist": ["**/npm-package", "**/npm-package/**"]
or select them with an exclude glob:
"nohoist": [
"**/!(my-site|my-cms|someones-components)"
]
See my answer to another question for more information.
I am trying to set up my Symfony 2.8 app for local development (following https://symfony.com/doc/current/deployment/heroku.html).
Added in the Procfile:
web: bin/heroku-php-apache2 web/
Error
bin/sh: vendor/bin/heroku-php-apache2: No such file or directory
Also note: composer.phar config bin-dir is bin.
Anyone who can share how they resolved this problem?
First of all, have you tried letting Heroku create the Procfile itself? I think it has lately been smart enough to work out the root of the Symfony project.
If that doesn't work, maybe that's not the right path; try:
echo 'web: $(composer config bin-dir)/heroku-php-apache2 web/' > Procfile
If none of those work, I'd rather follow Heroku's own information on how to deploy a Symfony app; have a look at this and see if it helps:
https://devcenter.heroku.com/articles/getting-started-with-symfony
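As a quick sanity check (just a sketch; the actual output depends on your composer.json), you can confirm what Composer's bin directory really is and whether the Heroku boot script ends up in it:

composer config bin-dir              # prints the configured binary directory (the question says this is "bin")
ls "$(composer config bin-dir)"      # check whether heroku-php-apache2 is actually present there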
I am trying to place an updated JAR under the lib path and remove the old JAR. Unfortunately, I still see the old log statements from the old JAR in the Oozie console. For confidentiality reasons I am unable to show the logs here, but I am doing the steps below:
Replacing the JAR (mycode.jar) under the lib folder that is referenced in workflow.xml
Submitting the Oozie job using oozie job -oozie http://host -config job.properties -run
When I look at the logs in the console, I can still see the old JAR's (older version of mycode.jar) log output even though the JAR has been replaced.
If you are talking about the lib directory in the Oozie workflow application, then you don't need to do anything. The next execution of the workflow will automatically pick up the new (updated) JAR.
For updating JARs in the share lib (/user/oozie/share/lib/lib_*/*), after replacing the JAR you need to execute the following command so the Oozie server refreshes its share lib:
oozie admin -sharelibupdate
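For example (the paths and the Oozie URL below are placeholders, adjust them to your cluster), the two cases might look like this:

# workflow application lib dir: just overwrite the JAR; the next run picks it up
hdfs dfs -put -f mycode.jar /user/me/apps/myworkflow/lib/mycode.jar
# share lib: overwrite the JAR, then tell the Oozie server to refresh its share lib
hdfs dfs -put -f mycode.jar /user/oozie/share/lib/lib_20190101000000/oozie/mycode.jar
oozie admin -oozie http://host:11000/oozie -sharelibupdate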
Hope this will help. Thanks.
To make sure the issue is the same, I'll describe what I was facing:
Created a MapReduce JAR and placed it in the lib folder.
Ran the Oozie job (MapReduce action); it picked up the JAR as expected and ran fine.
I made some functionality changes in my code (JAR), so I added new log statements to confirm the new JAR was being picked up. Built the JAR and replaced the old JAR with the newly built one in the lib folder (HDFS).
Ran the Oozie job again; code from the old JAR was executed, because the new log statements did not show up.
After some searching I found the following tips:
Clear the YARN cache: I found this on the Hortonworks site (https://community.hortonworks.com/articles/92339/how-to-clear-local-file-cache-and-user-cache-for-y.html) - pasting the content below for reference.
Short Description:
To use a different version of a JAR file with the same name, clear the cache on all NodeManager hosts to prevent the application from using the old JAR.
a. Find out the cache location by checking the value of the yarn.nodemanager.local-dirs property:
<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/hadoop/yarn/local</value>
</property>
b. Remove the filecache and usercache folders located inside the directories specified by yarn.nodemanager.local-dirs.
[yarn@node2 ~]$ cd /hadoop/yarn/local/
[yarn@node2 local]$ ls
filecache  nmPrivate  spark_shuffle  usercache
[yarn@node2 local]$ rm -rf filecache/ usercache/
c. Restart YARN service.
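Taken together, steps a-c might look roughly like this on a single NodeManager host (the config path and service name are assumptions for a typical HDP-style install, not part of the original article):

grep -A1 'yarn.nodemanager.local-dirs' /etc/hadoop/conf/yarn-site.xml   # a. find the cache location
rm -rf /hadoop/yarn/local/filecache /hadoop/yarn/local/usercache        # b. remove filecache and usercache
sudo systemctl restart hadoop-yarn-nodemanager                          # c. restart the NodeManager service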
I was unable to clear the cache because I did not have the necessary access, so I used the workaround below.
Rename the package or class: since this package/class was written by me, I had the liberty to simply rename the class; when Oozie looked up the new class name, the new functionality was automatically executed.
Option 2 may not be viable for many, and the question remains open as to why Oozie does not pick up the new JAR/class.
I am working on a Symfony project.
When I try to run:
php app/console cache:clear
I get the following ErrorException:
Warning: rename(../app/cache/dev, ../app/cache/dev_old): Access Denied. (Code: 5) in ../vendors/Symfony/src/Symfony/Bundle/FrameworkBundle/Command/CacheClearCommand.php on line 76
What is the problem here? I have given all permissions to the user on my machine (Windows 7 OS). Any ideas why it is happening?
Thank You.
Be sure that the files are not in use (as meze pointed out). If you're using something like TortoiseGit or NetBeans, be sure to mark the cache folder as ignored so that it is not accessed.
If all else fails, download a free program like Unlocker that will allow you to quickly and easily detach running processes from the files/folders you are trying to modify.
To expand on leek's post, Symfony 2 cache-clearing operations shuffle the cached items across different folders during the cleanup. Part of this process includes creating cache/dev_new/ and cache/dev_old/ folders.
If you are using Eclipse or another IDE that dynamically monitors subfolders within your project, the IDE will nearly instantly spot the new folder creation and look in those folders for new files (in Eclipse, I noticed the DLTK module constantly doing this in the Progress View). This may unfortunately get in the way of Symfony, which wants to rename and/or delete these folders.
Specifically with Eclipse Indigo on Windows 7 64-bit, you can remove the cache/, cache/dev/, cache/dev_old/ and cache/dev_new/ folders from the build path by right-clicking your project and selecting "Build Path > Configure Build Path...". This originally had no effect for me; I kept seeing the DLTK module trying to index the cache folders. I ended up uninstalling the Aptana Studio plug-in, closing all Editor documents, shutting down Eclipse, manually deleting the sub-folders in the cache/ folder, running Symfony cache:clear, then starting up Eclipse and reinstalling Aptana. Seems to have worked thus far.
It's an issue with Symfony 2.0.x and Symfony 2.1.x. There's a workaround for this:
Open the file: src\Symfony\Bundle\FrameworkBundle\Command\CacheClearCommand.php
and add a sleep(1); statement where the directory renaming is failing, in the execute() function:
//...
rename($realCacheDir, $oldCacheDir);
sleep(1);
rename($warmupDir, $realCacheDir);
//...
You might have to re-open the CLI twice and run cache:clear, but it will fix the problem after that
If you're using a text editor such as Sublime Text, try to make it ignore the path of the cached files:
Go to Preferences > Settings
and edit the config file:
{
    ...
    "folder_exclude_patterns": ["var", "node_modules", ".git"],
    ...
}
In my case, the cache folders are located at:
Symfony 4: /var
Symfony 2: /app/cache
Nice coding!
I had the same issue with Symfony 4.1.13, and the root cause was that VS Code was holding on to the file "var/cache/dev/srcDevDebugProjectContainer.xml4QSKuA".
The error was: Cannot rename "var/cache/dev/srcDevDebugProjectContainer.xml4QSKuA".
I fixed it by adding **/var in the "files to exclude" field.
The steps (the original answer showed them as screenshots): in the settings, search for the word "exclude", then add **/var/cache to the exclusion list.
It worked for me.
To have peace of mind once and for all, I created a clear.bat file in the MyProject folder under Windows 7.
rmdir d:\symfony\framework-standard-edition\app\cache\ /S /Q
I guarantee 100% effectiveness for cleaning the cache. Some minor problems can occur after refreshing the website for the first time: Symfony has to recreate some folders it needs.
SYMFONY AND ANTIVIRUS
Multiple Symfony invocations on a local Windows machine can clash with some antivirus software, such as NOD in my case. Exclude the Symfony cache folder from real-time protection.