Testing on VirtualBox with Laravel Dusk and using SQLite as the test database, I get
unlink(/var/www/laravel/database/database.sqlite): Text file busy
After some research I understand this problem is mainly caused by the VirtualBox shared folder.
I tried pointing to the SQLite file in the /tmp folder, but then I get
Database (/tmp/database.sqlite) does not exists
How can I work around this issue so my tests run?
First of all, you should create the file manually:
touch /tmp/database.sqlite
It's true that file sharing with VirtualBox or Docker can cause issues; I have hit the Text file busy error many times. Usually, restarting the VM gets rid of it.
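For the /tmp approach, create the file first (the touch above) and then point the Dusk environment at it. A minimal sketch, assuming a standard Laravel setup where Dusk reads a .env.dusk.local file:

```ini
; .env.dusk.local -- point the test environment at the /tmp copy
DB_CONNECTION=sqlite
DB_DATABASE=/tmp/database.sqlite
```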
First of all, I realize similar questions have been asked but none of them seem to have the same problem and I can't find a solution.
I can create tables and do write/read operations perfectly well from Python when accessing my SQLite database. However, when trying to access the database through DBeaver I get the following issues:
First, when trying to connect to the db file, it asks me "A file named database.db already exists. Do you want to replace it?"
When trying to look at the tables via GUI it loads for a couple of seconds before showing an error
I have not found a way to solve this issue. Has anyone experience with this and a solution?
EDIT: I want to add what SQLite has to say about the given error: https://www.sqlite.org/rescode.html#busy
It states that the error occurs "because of concurrent activity by some other database connection". I don't know where this concurrent activity would come from, though, as I'm closing everything and I'm just trying to look at the tables in the GUI. I think the issue has something to do with the first issue, where it asks me if I want to replace the file.
Based on the previous comments, uninstall the DBeaver snap
snap remove dbeaver-ce
and install using the .deb package from the official site
wget https://dbeaver.io/files/dbeaver-ce_latest_amd64.deb
sudo apt install ./dbeaver-ce_latest_amd64.deb
This works for me. All credit to the previous comments =)
TL;DR: If your database file is located on a mounted filesystem, you need to give DBeaver permission to read files from a mounted filesystem.
I have found two ways to solve this issue on Ubuntu:
1: Make sure your database file is in your home directory. Since DBeaver has permission to access your home directory, it will work.
OR
2: If you installed DBeaver from:
the Ubuntu Software Center directly, or
from the terminal using snap install
and your database file is located on a mounted filesystem, head over to Ubuntu Software Center => Installed, find DBeaver in the list and click on it. On the next window, top left, click on Permissions and toggle "Read system mount information and disk quotas", enter your password in the authentication prompt, and you're good to go.
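The same permission can also be granted from the terminal. Assuming the snap interface behind that toggle is named mount-observe (the usual name on current Ubuntu releases), the command would be:

```
sudo snap connect dbeaver-ce:mount-observe
```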
I work on a Symfony project using Vagrant. The host machine is running Windows. Because the request time is very high, I decided to install the vendor files inside the VM, while the entire "rest" of the project remains inside the synced folder (project root => /vagrant).
Everything works fine and the request time is now under 100 ms. But there is one issue left: I have to install the vendor on my Windows machine first and then again in the VM, otherwise PhpStorm is not able to index the files correctly (I know, this is a logical consequence).
So my question is: is it possible to host a project on the Windows machine, with the files for example under "C:\Users\SampleUser\Project\ProjectX" and the vendor installed under "/home/vagrant/vendor", and let PhpStorm index the files of both directories?
Otherwise I will have to live with this one, and code completion won't work.
Or I will have to install the libraries on both machines to improve the request time and keep a more or less "good" workflow.
I hope I have explained my actual problem well enough.
Thank you very much for your time.
Had the same exact problem. Indeed a bummer.
One possible solution is to leave the vendor folder on the VM and manually copy it to your host machine.
Pros:
PHPStorm is able to index files
Cons:
If you add a dependency, you have to copy some parts of the vendor folder manually to the host machine
To those facing the same problem, I can suggest SFTP (Tools -> Deployment -> Configuration in PhpStorm): files can be transferred without leaving the IDE window. The only thing to do is get the VM box password, which is located at
%USERNAME%/.vagrant.d/boxes/your box/box version/virtualbox/Vagrantfile
Second solution: if you are using Virtualbox, you can use vm.synced_folder with type: "virtualbox" (the sync works both ways, host<->guest), and leave the vendor folder in your project (for it to sync all the time).
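For reference, the Vagrantfile line for this second solution would look roughly like this (paths are placeholders):

```ruby
# Two-way VirtualBox shared folder; host path first, guest path second
config.vm.synced_folder "./", "/vagrant", type: "virtualbox"
```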
Pros:
vendor folder always up to date, no manual work
Cons:
Horrible performance (tested myself)
If you want to use non-virtualbox rsync (type: "rsync"), you will not get the ability to sync back from the guest (someone, please correct me if I'm wrong!), so you are left with the 1st solution.
It would be great if we could include the vendor folder directly from the VM (using some kind of rsync/symlink magic) to the "Languages & Frameworks -> PHP -> include path" list, at least when using VirtualBox, but oh well...
Hello, I'm new to MVC and have been looking for a solution to my problem for the last couple of days to no avail.
I've created a simple blog engine using ASP.NET MVC, after installing it on IIS on my local PC, I quickly realized I needed a database for the login service to work.
So I elected to use LocalDB on a PC that I plan to use as a server. I moved my files over to the PC and installed everything I needed. As soon as I installed SQLExpress with LocalDB and reset the site, everything was working perfectly. However, I noticed some minor typos on a section of the site that's not easily edited. Stupidly, I reinstalled the website entirely from a new build instead of just updating the view that needed correction like a smart person would do.
Now every time I attempt to log in to an existing account or create a new one, I simply get the error
Cannot attach the file 'C:\inetpub\wwwroot\App_Data\aspnet-FacetBlog-20161020065352.mdf' as database 'aspnet-FacetBlog-20161020065352'.
From what I've learned, it's something to do with my LocalDB instance, but the fixes I've found online seem to have no effect.
Admittedly, I'm pretty naive when it comes to SQL, so hopefully the fix is simple. If I've failed to provide vital information, please tell me and I'll update the question. Additionally, an explanation of what exactly went wrong would be much appreciated. Thank you for your time.
When reinstalling your site, you probably deleted the database file aspnet-FacetBlog-20161020065352.mdf in your database directory. Since deleting the MDF file doesn't affect the registered SQL Express instance, SQL Express thinks the database still exists, tries to connect, and fails.
First, try to attach the DB using SSMS query like this:
EXEC sp_attach_db @dbname = N'aspnet-FacetBlog-20161020065352', @filename1 = N'App_Data\aspnet-FacetBlog-20161020065352.mdf', @filename2 = N'App_Data\aspnet-FacetBlog-20161020065352.ldf'
NB: I assumed your MDF file is located in the App_Data directory; change it to your actual database directory.
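Note that sp_attach_db is deprecated in recent SQL Server versions; the equivalent CREATE DATABASE ... FOR ATTACH form (same assumed file locations) would be:

```sql
CREATE DATABASE [aspnet-FacetBlog-20161020065352]
ON (FILENAME = N'App_Data\aspnet-FacetBlog-20161020065352.mdf'),
   (FILENAME = N'App_Data\aspnet-FacetBlog-20161020065352.ldf')
FOR ATTACH;
```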
If attempt to attach by query doesn't work, try these steps:
Stop the IIS/IIS Express pool if it is still running.
Open a Windows PowerShell instance on your server PC (install it first if it doesn't exist).
Run the following command:
sqllocaldb.exe stop v11.0
sqllocaldb.exe delete v11.0
sqllocaldb.exe start v11.0
Recreate the DB instance on your project, including data connection in Server Explorer (remove then create).
Re-run the project on the development machine, then copy all required files to the server PC. The database instance should be (re-)generated at this point.
Restart IIS/IIS Express pool on server PC.
Additionally, you may run the Update-Database command from the Package Manager Console to ensure database integrity.
If those solutions still don't work, simply rename your database files, attach them in your project, verify the connection string, then retry steps (5) and (6) above.
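When verifying the connection string, it should reference both the MDF path and the database name. A sketch of the web.config entry, assuming the default MVC project layout and a LocalDB v11.0 instance:

```xml
<connectionStrings>
  <add name="DefaultConnection"
       providerName="System.Data.SqlClient"
       connectionString="Data Source=(LocalDb)\v11.0;AttachDbFilename=|DataDirectory|\aspnet-FacetBlog-20161020065352.mdf;Initial Catalog=aspnet-FacetBlog-20161020065352;Integrated Security=True" />
</connectionStrings>
```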
Related problems:
EF5: Cannot attach the file '{0}' as database '{1}'
Cannot attach the file ".mdf" as database "aspnet-"
Delete .mdf file from app_data causes exception cannot attach the file as database
Cannot attach the file 'C:\inetpub\wwwroot\MvcApplication10\App_Data\aspnet-MvcApplication10-20130909002323.mdf' as database 'aspnet-MvcApplication10-20130909002323'
I have a fairly simple flask app which runs fine on my local machine. The app uses sqlite3. I am trying to deploy to a CentOS machine running nginx and uwsgi. The app starts but when I try to access the site through chrome, it raises an exception:
sqlite3.OperationalError: unable to open database file
I believe I have all the permissions correct; the user starting the app has ownership of the database file. All the directories have 777 permissions and the database has 665 permissions. nginx is started using sudo.
I have combed through all the existing posts about this kind of thing. People talk about permissions, but I am pretty sure I have those correct. The name of the file is correct.
DATABASE = 'sqlite:////home/.../firstDB.db'
I get the same error if the database points to a nonexistent file. What else could be going wrong?
So it turns out that the sqlite://// file-name prefix is incorrect. I don't understand this, as it worked before. I just used the plain file path and it works now.
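The distinction can be demonstrated with the sqlite3 module directly (file names below are made up): sqlite3.connect() expects a plain filesystem path, while the sqlite://// prefix is SQLAlchemy-style URL syntax. Passed to sqlite3, the URL is treated as a literal path in a directory that does not exist, which raises exactly this error.

```python
import os
import sqlite3
import tempfile

# A plain filesystem path works with the sqlite3 module.
path = os.path.join(tempfile.gettempdir(), "demo_firstDB.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE IF NOT EXISTS posts (id INTEGER PRIMARY KEY)")
conn.close()

# A SQLAlchemy-style URL is NOT a path: sqlite3 treats the whole
# string as a literal filename under a nonexistent directory.
try:
    sqlite3.connect("sqlite:////no/such/dir/firstDB.db")
    failed = False
except sqlite3.OperationalError as exc:
    failed = True
    print(exc)  # unable to open database file
```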
I've set up a Vagrant box that runs my webserver to host my Symfony2 application.
Everything works fine except the folder synchronization.
I tried 2 things:
config.vm.synced_folder LOCALFOLDER, HOSTFOLDER
config.vm.synced_folder LOCALFOLDER, HOSTFOLDER, type: "rsync"
Option 1: The first option works; I actually don't know how the files are shared, but it works.
Files are copied both ways, but the application is SUPER slow.
Symfony is generating cache files which might be the issue, but I don't really know how to troubleshoot this and see what is happening.
Option 2: Sync is only done one way (from my local machine to the Vagrant box), which covers most cases and is fast.
The issue is that when I use the Symfony command line on the Vagrant box to generate some files, they are not copied over to my local machine.
My question is:
What is the best way to get two-way syncing? With option 1, how can I exclude some files from syncing (as that might be the issue)?
With option 2, how can I make sure changes on the remote are copied to my local machine?
If the default synced folder strategy (VirtualBox shared folders, I imagine) is slow for your use case, you can choose a different one and, if you need, keep the two-way sync:
If your host OS is Linux or Mac OS X, you can go with NFS.
If your host OS is Windows you can instead choose SMB.
Rsync is very fast but, as you've pointed out, is one-way only.
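A Vagrantfile sketch of both strategies (pick one; paths and the exclude list are examples, and rsync__exclude addresses the "exclude some files" part of the question):

```ruby
# NFS: two-way sync, fast on Linux/macOS hosts
config.vm.synced_folder ".", "/vagrant", type: "nfs"

# rsync: one-way (host -> guest) but very fast; skip generated folders
config.vm.synced_folder ".", "/vagrant", type: "rsync",
  rsync__exclude: ["app/cache/", "app/logs/"]
```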
As it doesn't seem Vagrant offers a "built-in" way to do this here is what I did:
Configure a Vagrant rsync folder for the directories that will contain application-generated files (in Symfony2, your Bundle/Entity folder). Note that I didn't sync the root folder, because some folders don't have to be rsynced (cache/logs...) and also because it took way too much time for the rsync process to parse all the folders/subfolders when I knew that only the Entity folder would be generated.
As the rsync back has to be done from the Vagrant box to the host, I use the vagrant-rsync-back plugin and run it manually every time I use a command that generates code.
https://github.com/smerrill/vagrant-rsync-back#getting-started
Create a watcher on my local machine that tracks any change in the code and rsyncs it to the Vagrant box.
https://gist.github.com/laurentlemaire/e423b4994c7452cddbd2
Vagrant mounts your project root as the /vagrant folder inside the box as a two-way share.
You can run your commands there to get the required files synced. Any I/O will be damn slow (as you already mentioned), but you will get your files. For other stuff, use your one-way synced folder.