Expand Column List in OpenEdge in DataGrip

I have a connection to an OpenEdge database using DataGrip. The intellisense seems very minimal. At a minimum, I would like to be able to expand the column list from the star. I know it can work because I have set up other systems that use OpenEdge and JetBrains. Is there some setting or something I am missing to accomplish this?

Related

How to determine the largest length of Progress OpenEdge ABL fields

In OpenEdge ABL / Progress 4GL, a field can be defined with a FORMAT, but that is only the default format for it to be displayed. Thus, a CHARACTER field with FORMAT 'X(10)' could store thousands of characters past the first ten.
The database I'm using contains millions of rows in some of the tables I'm concerned with. Is there any system table or Progress-internal program I can use to determine the longest length of a given field? I'm looking for anything more efficient than full-table scans. I'm on Progress OpenEdge 11.5.
"dbtool" will scan the db and find fields whose width exceeds the "sql width". By default that is 2x the format that was defined for character fields.
https://knowledgebase.progress.com/articles/Article/P24496/
Of course it has to scan the table to do that, so it may not meet your "more efficient than table scans" criterion. FWIW, dbtool is reasonably efficient.
If the fields you are concerned about are problematic because of potential SQL access, you might also want to look into "authorized data truncation" via the -SQLTruncateTooLarge parameter, which will truncate the data on the fly.
Another option would be -SQLWidthUpdate which automatically adjusts the SQL width on the fly. That requires an upgrade to at least 11.6.
Both of these might solve your problem without periodic table scans.
If it's actually the character format you want to adjust to match the data, you could use dbtool to adjust the SQL width of all the fields and then set the character format to half the SQL width.
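If you go that route, the current format and SQL width can also be inspected over a plain SQL connection rather than through dbtool's menus. Below is a minimal C# sketch using ODBC; it assumes an ODBC DSN has been set up for the database and that the metaschema is exposed to SQL as PUB."_Field" with _Format and _Width columns — treat both as assumptions to verify against the 11.x documentation.

```csharp
using System;
using System.Data.Odbc;   // ODBC is one common way to reach OpenEdge SQL

class FieldWidths
{
    static void Main()
    {
        // "OpenEdgeDSN" is a placeholder; point it at your database's SQL broker.
        using (var conn = new OdbcConnection("DSN=OpenEdgeDSN"))
        {
            conn.Open();
            // _Field is the OpenEdge metaschema table; _Format should hold the
            // display format and _Width the SQL width (assumption -- verify).
            var cmd = new OdbcCommand(
                "SELECT \"_Field-Name\", \"_Format\", \"_Width\" FROM PUB.\"_Field\"",
                conn);
            using (var rdr = cmd.ExecuteReader())
                while (rdr.Read())
                    Console.WriteLine("{0}: format={1}, sql-width={2}",
                        rdr.GetString(0), rdr[1], rdr[2]);
        }
    }
}
```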

Is it possible to change a column without running raw SQL in DbVisualizer + SQLite

Working on a SQLite database, DbVisualizer Pro seems to do almost everything very well, except one thing:
changing the table schema.
I often need to change a column's name, data type, etc., but I don't want to do it through raw SQL statements. My workaround is opening Firefox's SQLite Manager just to change the schema.
Is it possible to use DbVisualizer to change the schema? Many thanks!
Edit:
The Alter Table action mentioned below by roger seems to be the right way to go, but somehow I can only add columns; the existing columns appear to be read-only.
Mine is the DbVisualizer Pro evaluation. Is the non-evaluation version different?
Edit2:
Using SQLite Manager is sometimes dangerous, as warned below. I just learned that renaming a column may cause foreign keys to be lost, but the workaround is here.
In DbVisualizer Pro there is the Alter Table action (and Create Table for creating new tables). Select the table you want to change in the Databases tab, right-click and choose Alter Table. In order for this to work you need DbVisualizer Pro, and the Database Type for your connection must be set to either Auto Detect (recommended) or SQLite.
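On the foreign-key problem from Edit2: the approach SQLite's own documentation recommends for schema changes its ALTER TABLE can't express is to rebuild the table rather than edit it in place. A minimal sketch in C# with the Microsoft.Data.Sqlite package; the person table and its columns are made up for illustration.

```csharp
using Microsoft.Data.Sqlite;   // NuGet: Microsoft.Data.Sqlite

class RenameColumn
{
    static void Main()
    {
        using (var conn = new SqliteConnection("Data Source=app.db"))
        {
            conn.Open();
            var cmd = conn.CreateCommand();
            // Rebuild the table so constraints and foreign keys end up exactly
            // as declared, instead of renaming a column in place.
            cmd.CommandText = @"
                PRAGMA foreign_keys = OFF;
                BEGIN;
                CREATE TABLE person_new (
                    id        INTEGER PRIMARY KEY,
                    full_name TEXT NOT NULL          -- renamed from 'name'
                );
                INSERT INTO person_new (id, full_name)
                    SELECT id, name FROM person;
                DROP TABLE person;
                ALTER TABLE person_new RENAME TO person;
                COMMIT;
                PRAGMA foreign_keys = ON;";
            cmd.ExecuteNonQuery();
        }
    }
}
```

(Recent SQLite, 3.25 and later, has ALTER TABLE ... RENAME COLUMN that fixes up references itself; the rebuild pattern is the safe route on older versions.)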

Best way to get rows changed in an sqlite db

I'm watching an SQLite database which an app uses. I want to know what changes have been made since I last checked. I can dump to SQL and diff against the last dump, but it seems there should be a better way. Is there?

Thanks,
Kent

PS Not to be coy, specifics: I'm managing photos with Shotwell, which has a great GUI. I'm mirroring Shotwell's DB in PostgreSQL, where I've restructured and augmented it to my liking. After a Shotwell session, which involves adding, tagging, adjusting ... I want to apply those changes to Postgres.
Add a field named _changed to your table(s). On every manipulation of a row (UPDATE, INSERT, ...) set the field to the current timestamp. You can then check which rows have been updated since your last check.
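This keeps _changed up to date from the application side; if you would rather not touch every write, a trigger can stamp the column automatically. A hedged C# sketch with Microsoft.Data.Sqlite — the photo table and its id column are stand-ins, not Shotwell's real schema.

```csharp
using System;
using Microsoft.Data.Sqlite;   // NuGet: Microsoft.Data.Sqlite

class ChangeTracking
{
    static void Main()
    {
        using (var conn = new SqliteConnection("Data Source=photos.db"))
        {
            conn.Open();

            // One-time setup: add the column, then let triggers stamp it on
            // every INSERT and UPDATE. (SQLite ships with recursive triggers
            // off, so the trigger's own UPDATE does not re-fire it.)
            var setup = conn.CreateCommand();
            setup.CommandText = @"
                ALTER TABLE photo ADD COLUMN _changed TEXT;
                CREATE TRIGGER photo_ins AFTER INSERT ON photo BEGIN
                    UPDATE photo SET _changed = strftime('%Y-%m-%dT%H:%M:%fZ','now')
                    WHERE id = NEW.id;
                END;
                CREATE TRIGGER photo_upd AFTER UPDATE ON photo BEGIN
                    UPDATE photo SET _changed = strftime('%Y-%m-%dT%H:%M:%fZ','now')
                    WHERE id = NEW.id;
                END;";
            setup.ExecuteNonQuery();

            // After a Shotwell session: pull only the rows that changed.
            var since = conn.CreateCommand();
            since.CommandText = "SELECT id FROM photo WHERE _changed > $since";
            since.Parameters.AddWithValue("$since", "2024-01-01T00:00:00.000Z");
            using (var rdr = since.ExecuteReader())
                while (rdr.Read())
                    Console.WriteLine(rdr.GetInt64(0));
        }
    }
}
```

ISO-8601 strings compare correctly as text, which is why a plain > works for the "since" filter.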

How to reset your database in Visual Studio 2010

I want to wipe out all the data in the rows of the tables that I have. How do I do it? I want to completely delete the data. I have 4 tables, and I'd prefer to delete/reset them all together.
This article may help you. It uses the built-in sp_MSForEachTable to disable constraints and then truncate the data; a runnable sketch of that approach follows the answers below.
SQL Table?
Try TRUNCATE TABLE [tablename] - this should delete the rows and reset the identity IDs.
Tim's solution is good, but just make sure that you are aware of any dependencies between your tables.
Start by deleting from the farthest child table in the relationship, and go up one level at a time until you reach a table that only has foreign keys to other tables.
And it's always better to keep this as a SQL script that you can run whenever you need to do the reset.
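Wrapping the sp_MSForEachTable approach from the first answer in code looks roughly like this. Two caveats: sp_MSForEachTable is an undocumented SQL Server procedure, and TRUNCATE is refused on tables referenced by foreign keys, so the sketch uses DELETE plus an identity reseed instead. The connection string is a placeholder.

```csharp
using System.Data.SqlClient;   // classic .NET Framework SQL Server client

class ResetTables
{
    static void Main()
    {
        // Placeholder connection string -- point it at your own database.
        using (var conn = new SqlConnection(
            @"Server=.\SQLEXPRESS;Database=MyDb;Integrated Security=true"))
        {
            conn.Open();
            var steps = new[]
            {
                // Disable all constraints so delete order doesn't matter.
                "EXEC sp_MSForEachTable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'",
                // DELETE instead of TRUNCATE: TRUNCATE fails on tables that
                // other tables reference.
                "EXEC sp_MSForEachTable 'DELETE FROM ?'",
                // Restart identity columns at 0 (drop this step for tables
                // without an identity column, where CHECKIDENT errors out).
                "EXEC sp_MSForEachTable 'DBCC CHECKIDENT (''?'', RESEED, 0)'",
                // Re-enable and re-validate the constraints.
                "EXEC sp_MSForEachTable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'"
            };
            foreach (var sql in steps)
                using (var cmd = new SqlCommand(sql, conn))
                    cmd.ExecuteNonQuery();
        }
    }
}
```

Keeping these four statements in a .sql file, as the last answer suggests, works just as well if you don't need it in application code.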

Use LINQ to insert data from dataset to SQL

Let's say I have a dataset in an ASP.NET website (.NET 3.5) with 5 tables, each with roughly 30,000 rows and an average of 12 columns. I want to insert all of the data from the dataset into 5 very-similar-but-not-quite-identical tables in SQL Server 2008. I also want to use LINQ (personal preference - trying to learn something new).
Is it as simple as iterating through the dataset and, for each row, creating a new instance of the associated class, initializing its data with the dataset's row, adding it to the data model, and then doing one giant SubmitChanges at the end?
Are there better ways of doing this with LINQ? Or is this the de facto standard?
Creating objects and inserting them is fine. But to avoid a gigantic commit at the end, you might want to perform a SubmitChanges() every 100 rows or so.
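A minimal sketch of that batching pattern. MyDataContext, TargetRow, and the column names are placeholders for your generated .dbml classes; InsertOnSubmit and SubmitChanges are the real LINQ to SQL calls.

```csharp
using System.Data;   // DataSet / DataRow

class Loader
{
    // "MyDataContext" and "TargetRow" are stand-ins for your .dbml classes.
    static void CopyRows(DataSet dataSet, string connectionString)
    {
        using (var db = new MyDataContext(connectionString))
        {
            int pending = 0;
            foreach (DataRow row in dataSet.Tables["Source"].Rows)
            {
                db.TargetRows.InsertOnSubmit(new TargetRow
                {
                    Id   = (int)row["Id"],
                    Name = (string)row["Name"]
                });

                // Flush every 100 rows instead of one giant commit at the end.
                if (++pending % 100 == 0)
                    db.SubmitChanges();
            }
            db.SubmitChanges();   // flush whatever is left over
        }
    }
}
```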
Alternatively, you could get a copy of Red Gate's "SQL Data Compare" utility if you have the cash. Then you never have to write one of these things again. :-)
Edit 2010-04-19: If you want to use a transaction, I think you should still use my approach instead of a single SubmitChanges(). In this case you'll want to explicitly manage your own transaction in L2S (see http://msdn.microsoft.com/en-us/library/bb386995.aspx). Run your queries in a try/catch and roll back the transaction if you get any failures.
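In code, managing the transaction yourself looks something like the following sketch (again with the hypothetical MyDataContext). Every batched SubmitChanges joins the one transaction, so a failure rolls back all batches, not just the last one.

```csharp
using (var db = new MyDataContext(connectionString))
{
    db.Connection.Open();
    using (var tx = db.Connection.BeginTransaction())
    {
        db.Transaction = tx;   // SubmitChanges calls now enlist in this transaction
        try
        {
            // ... the batched InsertOnSubmit / SubmitChanges loop from above ...
            tx.Commit();
        }
        catch
        {
            tx.Rollback();     // undo every batch
            throw;
        }
    }
}
```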
Two last bits of advice:
Make sure your ASP.NET timeout is set high enough.
Consider printing out some kind of progress indicator. It makes running these kinds of long-running jobs much more palatable.
LINQ to SQL doesn't natively have anything like the SqlBulkCopy class. I did a quick search, and it looks like there's an implementation for LINQ to SQL. No clue if it's any good, but it can't hurt to check it out.
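Since plain SqlBulkCopy (System.Data.SqlClient, available in .NET 3.5) accepts a DataTable directly, you can also skip LINQ to SQL for the load itself. A short sketch; the table and column names are invented.

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkLoad
{
    static void Copy(DataSet dataSet, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.TargetTable";   // hypothetical
            // Map source columns onto the not-quite-identical destination ones.
            // Note: once any mapping is added, every column must be mapped.
            bulk.ColumnMappings.Add("Id", "Id");
            bulk.ColumnMappings.Add("Name", "FullName");
            bulk.BatchSize = 5000;
            bulk.WriteToServer(dataSet.Tables["Source"]);
        }
    }
}
```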
DataContext.ExecuteCommand can be used with an arbitrary SQL statement, so you could do an "INSERT ... SELECT".
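ExecuteCommand passes the string straight through, so an INSERT ... SELECT works if the source rows are already in a server-side table (the staging table here is hypothetical — for in-memory DataSet rows you would stage them first, e.g. with SqlBulkCopy as above).

```csharp
using (var db = new MyDataContext(connectionString))
{
    // Set-based copy entirely on the server; no objects are materialized.
    db.ExecuteCommand(
        @"INSERT INTO dbo.TargetTable (Id, Name)
          SELECT Id, Name FROM dbo.StagingTable");
}
```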
