Is there a way to check the syntax of a SQLite3 script without running it?
Basically, I'm looking for the SQLite3 equivalent of ruby -c script.rb, perl -c script.pl, php --syntax-check script.php, etc.
I've thought of using explain, but most of the scripts I'd like to check are kept around for reference purposes (and don't necessarily have an associated database). Using explain would also make it hard to use with something like Syntastic. (That is, I'm only wanting to check syntax, not semantics.)
Update:
I'm confused. Let's say I want to syntax check this:
select * from foo;
I could do something like:
echo 'explain select * from foo;' | sqlite3
But then I get:
SQL error near line 1: no such table: foo
All I'd expect to see is "Syntax OK" (or something similar). Is that possible?
A syntax checker that works well for me so far is the Ruby gem sqlint.
It checks ANSI SQL, so it is not specific to the SQLite dialect.
It came out in 2015, five years after this question was asked.
It needs Ruby and a C compiler (to build the pg_query native extension).
Here is an example of output:
$ sqlint ex7a.sql
ex7a.sql:81:10:ERROR syntax error at or near "*"
In the true Unix tradition, if the syntax is correct, sqlint produces no output. As the questioner requested, it does not check whether tables exist.
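If you want to hook it into a build or an editor check such as Syntastic, the exit status is the thing to test. A minimal sketch, assuming sqlint exits non-zero when it reports errors (typical for lint tools):

# Print "Syntax OK" only when sqlint stays quiet; assumes a non-zero exit status on errors.
if sqlint ex7a.sql; then
    echo "Syntax OK"
else
    echo "Syntax errors found" >&2
fi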
As you mentioned, you can use the EXPLAIN keyword. It returns information about how the statement would have been executed had the preceding EXPLAIN keyword been omitted.
You could probably use EXPLAIN against an in-memory database. With sqlite3, you can get an in-memory database by passing ":memory:" as the filename to sqlite3_open_v2().
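For example, a rough sketch from the shell (the table foo and its columns are made-up placeholders): you still have to create stubs for whatever tables the statement references before EXPLAIN will accept it, so this checks slightly more than pure syntax.

# Stub the referenced table in a throwaway in-memory database, then let EXPLAIN parse the statement.
# sqlite3 should exit non-zero if the SQL fails (-bail stops at the first error).
echo 'CREATE TABLE foo (id INTEGER, name TEXT);
EXPLAIN SELECT * FROM foo;' | sqlite3 -bail :memory: > /dev/null && echo "Syntax OK"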
Related
I've found a few Stack Overflow questions talking about this, but they all deal only with the :nmap or :noremap commands.
I want a command, not just a keybinding. Is there any way to accomplish this?
Use-case:
When I run :make, the file isn't saved automatically, so I'd like to combine :make and :w. I'd like to create a command such as :Compile, :C, or :Wmake to achieve this.
General information about concatenating Ex commands via | can be found at :help cmdline-lines.
You can apply this to interactive commands, to mappings, and to custom commands as well.
Note that you only need the special <bar> in mappings (to avoid prematurely ending the mapping definition and executing the remainder immediately, a frequent beginner's mistake: :nnoremap <F1> :write | echo "This causes an error during Vim startup!"<CR>). For custom commands, you can just write |, but keep in mind which commands treat it as part of their own argument.
:help line-continuation will help with overly long command definitions. Moving multiple commands into a separate :help :function can help, too (but note that this subtly changes the error handling).
Arguments
If you want to pass custom command-line arguments, you can add -nargs=* to your :command definition and then specify the insertion point on the right-hand side via <args>. For example, to pass arguments on to the :w command, you could use
:command -nargs=* C w <args> | silent make | redraw!
You can combine commands with |; see the help for :bar:
command! C update | silent make | redraw!
However, there is a cleaner way to achieve what you want: just enable the 'autowrite' option to automatically write modified files before a :make:
'autowrite' 'aw' 'noautowrite' 'noaw'
'autowrite' 'aw' boolean (default off)
global
Write the contents of the file, if it has been modified, on each
:next, :rewind, :last, :first, :previous, :stop, :suspend, :tag, :!,
:make, CTRL-] and CTRL-^ command; and when a :buffer, CTRL-O, CTRL-I,
'{A-Z0-9}, or `{A-Z0-9} command takes one to another file.
Note that for some commands the 'autowrite' option is not used, see
'autowriteall' for that.
This option is mentioned in the help for :make.
I have found a solution after a bit of trial and error.
Solution for my use case
command C w <bar> silent make <bar> redraw!
This is for compiling with make; output is shown only if make actually produces any.
General solution
command COMMAND_NAME COMMAND_TO_RUN
where COMMAND_TO_RUN can be built from more than one command using the following construct:
COMMAND_1_THEN_2 = COMMAND_1 <bar> COMMAND_2
You can use this multiple times, and it is very similar to pipes in the shell.
I am exploring writing functions in Tcl for SQLite (https://github.com/pawelsalawa/sqlitestudio/wiki/ScriptingTcl).
I wanted to try a basic example found on the official SQLite page (http://sqlite.org/tclsqlite.html):
db eval {SELECT * FROM MyTable ORDER BY MyID} values {
parray values
puts ""
}
I get the following error:
Error while requesting the database « -- » : invalid command name "parray"
Help is very welcome :)
SqliteStudio does not seem to fully initialise Tcl the way you would expect from a non-embedded installation:
Using external Tcl packages or modules is not possible, because Tcl
interpreters are not initialized with "init.tcl".
See Wiki.
Background
Standard Tcl sources init.tcl early on, as part of a Tcl interpreter's initialisation. init.tcl, in turn, registers a number of Tcl procs for autoloading; parray is one of those lazily acquired procs.
Ways forward
I am not familiar with SqliteStudio. Why not stick with SQLite's standard Tcl frontend, which gives you full Tcl and comes bundled with Tcl distributions? But this certainly depends on your requirements.
That said, you could attempt to force-load init.tcl in SqliteStudio's embedded Tcl, but I don't know (and can't test) whether the distribution has pruned these scripts or relocated them. Off the top of my head (untested):
source [file join $tcl_library init.tcl]
# ...
db eval {SELECT * FROM MyTable ORDER BY MyID} values {
parray values
puts ""
}
I am working on a PHP/JavaScript project where I've set up a nice build workflow. It involves testing, minifying, compressing into the final zip deliverable, and a whole lot of other nice stuff.
I want to build a task that fails when certain patterns appear in the source code. I would like to look for any print_r(), error_log(), var_dump(), etc. calls, and halt the build process if there are any. Later I might also want to check for things in JavaScript or CSS, so this is not only a PHP question.
I know it can be done with grunt-shell and grep but I'd like to know the following:
Are there any grunt plugins specific to this task? Ideally I would like to be able to specify a list of regexes per file type, and to set whether to continue or fail the build on pattern match.
How do others tackle the problem of double-checking the packaged source for the most common debug statements or other patterns?
Not a complete answer to my question, but I've recently come across this grunt plugin which is somewhat related. It removes console.log statements from JavaScript. Haven't tried it yet. Looks good. I still would like to know if there's something similar for PHP though.
http://grunt-tasks.com/grunt-remove-logging-calls/
Edit: Seeing as there are only tumbleweeds rolling in the wind here, I'm posting my workaround, which is based on grunt-shell. However, this is not what I was looking for, and it's not perfect because it doesn't do proper syntax parsing:
shell: {
    check_debug_prints: {
        // Backslashes are doubled so that egrep actually receives \. and \w in the pattern.
        command: '(! (egrep -r "var_dump|print_r|error_log" --include=*.php src || egrep -r "console\\.\\w+|debugger;" --include=*.js src) ) || (echo "Debug prints in source - build aborted" && false )'
    }
},
and
grunt.loadNpmTasks( 'grunt-shell' );
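The same check can also live outside Grunt, for example as a git pre-commit hook. A sketch under the same assumptions (PHP and JS sources under src/, same patterns as above):

#!/bin/sh
# Fail (exit 1) if debug prints are found in PHP or JS sources under src/.
if egrep -rn "var_dump|print_r|error_log" --include='*.php' src ||
   egrep -rn "console\.\w+|debugger;" --include='*.js' src; then
    echo "Debug prints in source - build aborted" >&2
    exit 1
fi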
Edit 2: I finally found the exact grunt plugin I was looking for: grunt-search. It has a failOnMatch boolean option that lets you indicate whether a particular regex pattern should cause the build to fail when found.
All, I am running the following script to load data onto an Oracle server from a Unix box using sqlldr. Earlier it gave me an error saying sqlldr: command not found. I added "SQLPLUS < EOF", but now it gives me an "unexpected end of file" syntax error on line 12, even though the script is only 11 lines long. What do you think the problem is?
#!/bin/bash
FILES='ls *.txt'
CTL='/blah/blah1/blah2/name/filename.ctl'
for f in $FILES
do
cat $CTL | sed "s/:FILE/$f/g" >$f.ctl
sqlplus ID/'PASSWORD'#SERVERNAME << EOF sqlldr SCHEMA_NAME/SCHEMA_PASSWORD control=$f.ctl data=$f EOF
done
sqlplus will never know what to do with the command sqlldr; they are two separate, complementary command-line utilities for interfacing with an Oracle DB.
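As an aside, the "unexpected end of file" in the question comes from the here-document: the terminating EOF must start a line of its own, whereas the original script puts << EOF, the command, and EOF all on one line, so the shell never finds the terminator. Purely as an illustration (connection details are placeholders), a well-formed sqlplus here-document looks like this:

# Illustrative only: the here-document terminator must sit alone at the start of its own line.
sqlplus -s ID/PASSWORD@SERVERNAME << EOF
select sysdate from dual;
exit
EOF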
Note that NO sqlplus or EOF etc. is required to load data into a schema:
#!/bin/bash
#you dont want this FILES='ls *.txt'
CTL_PATH='/blah/blah1/blah2/name'
CTL_FILE="$CTL_PATH/filename.ctl"
SCHEMA_NM=SCHEMA_NAME
SCHEMA_PSWD=SCHEMA_PASSWORD
SERVER_NM=SERVERNAME
for f in *.txt
do
# don't need cat! cat $CTL | sed "s/:FILE/$f/g" >"$f".ctl
sed "s/:FILE/$f/g" "$CTL_FILE" > "$CTL_PATH/$f.ctl"
#myBad sqlldr "$SCHEMA_NAME/$SCHEMA_PASSWORD" control="$CTL_PATH/$f.ctl" data="$f"
sqlldr "$SCHEMA_NM/$SCHEMA_PSWD@$SERVER_NM" control="$CTL_PATH/$f.ctl" data="$f" rows=10000 direct=true errors=999
done
Without getting too philosophical, using assignments like FILES=$(ls *.txt) is a bad habit to get into. By contrast, for f in *.txt deals correctly with files that have odd characters in their names (like spaces or other syntax-breaking values). BUT the other habit you do want to get into is to quote all variable references (like $f) with double quotes: "$f", OK? ;-) That is the other side of the protection for files with spaces etc. embedded in their names.
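A tiny sketch of the difference (file names invented):

# Globbing keeps "my data.txt" as a single item; word-splitting the output of ls would not.
for f in *.txt; do
    printf 'would load: %s\n' "$f"    # always quote "$f"
done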
In the edit update, I've turned your CTL_PATH and CTL_FILE into variables. I think I understand your intent: you have one standard CTL_FILE that you pass through sed to create a table-specific .ctl file (a good approach in my experience). Note that you don't need cat to send a file to sed, but your use of redirection (> $f.ctl) to create an altered file is very shell-like too.
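For illustration, here is a hypothetical filename.ctl template using the :FILE placeholder, plus the sed call that specialises it (table and column names are invented):

# filename.ctl (hypothetical template):
#   LOAD DATA
#   INFILE ':FILE'
#   INTO TABLE my_table
#   FIELDS TERMINATED BY ','
#   (col1, col2, col3)
# Rewrite the placeholder for one data file:
sed "s/:FILE/data1.txt/g" filename.ctl > data1.txt.ctl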
In the 2nd edit update, I looked here on S.O., found an example sqlldr command line with the correct syntax, and modified it to work with your variable names.
To finish up,
A. Are you sure the Oracle Client package is installed on the machine that you are running your script on?
B. Is the /path/to/oracle/client/tools/bin included in your working $PATH?
C. Try which sqlldr. If you don't get anything, either it's not installed or it's not in the path.
D. If not installed, you'll have to get it installed.
E. Once installed, note the directory that contains the sqlldr cmd. find / -name 'sqlldr*' will take a long time to run, but it will print out the path you want to use.
F. Take the "path" part of what is returned (like /opt/oracle/11.2/client/bin/, but not the sqlldr at the end), and edit the script at the 2nd line with:
export ORCL_PATH="/path/you/found/to/oracle/client"
export PATH="$ORCL_PATH:$PATH"
These steps should solve any remaining issues. If this doesn't work, see if there is someone where you work who understands your local computing environment and can help explain any missing or different steps.
IHTH
Minor problem, nevertheless irritating: is there a way to prevent the following message from appearing each time I run a query:
-- Loading resources from /Users/ThG/.sqliterc
As a stupid workaround, this works:
sqlite your_sqlite.db 'select * from your_table' < /dev/null
This is because the current code does this:
if( stdin_is_interactive ){
utf8_printf(stderr,"-- Loading resources from %s\n",sqliterc);
}
Forcing a stdin redirect thwarts this due to this piece of code:
stdin_is_interactive = isatty(0);
This works as well:
sqlite -batch your_sqlite.db 'select * from your_table'
due to this piece of code:
}else if( strcmp(z,"-batch")==0 ){
/* Need to check for batch mode here to so we can avoid printing
** informational messages (like from process_sqliterc) before
** we do the actual processing of arguments later in a second pass.
*/
stdin_is_interactive = 0;
}
but it's longer, so kind of defeats the purpose.
I know that this question is PRETTY old now, but simply deleting '/Users/ThG/.sqliterc' should solve the problem. '.sqliterc' is a configuration file for sqlite's interactive command line front-end. If you don't spend a lot of time in there, you won't miss the file.
That resource msg comes out on stderr, and it's followed by a blank line, so you could get rid of it with something like this (wrapped up in a script file of its own):
#!/bin/bash
sqlite3 -init /your/init/file /your/sqlite3/file.db "
your
SQL
cmds
" 2>/dev/null | sed -e1d
When using sqlite in shell scripts, you usually don't even want your ~/.sqliterc to be loaded at all. This works well for me:
sqlite3 -init <(echo)
Explanation:
-init specifies the file to load instead of ~/.sqliterc.
<(echo) uses Process Substitution to provide a path to a temporary empty file.
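A full invocation might look like this (database and query are placeholders):

# Skip ~/.sqliterc entirely by pointing -init at an empty temporary file.
sqlite3 -init <(echo) your_sqlite.db 'select * from your_table;'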
A bit late, but @levant pied almost had the solution: you need to pass an additional -interactive to silence the "-- Loading resources from" message.
$ sqlite3 -batch -interactive
SQLite version 3.31.1 2020-01-27 19:55:54
Enter ".help" for usage hints.
sqlite> .q
You can simply rename your config file to disable the warning, then revert the rename afterwards to keep the configuration for later use.
I use the following:
#suppress default configuration warnings
mv $HOME/.sqliterc $HOME/.backup.sqliterc
# sqlite3 scripts...
#revert
mv $HOME/.backup.sqliterc $HOME/.sqliterc