I have a complex query that includes some dynamic SQL which partially depends upon a checkboxlist. Here's the part that has me stumped right now (brain fart?).
Simple example:
Table A (id, name)
Table B (id, Aid, Cid)
Table C (id, color)
So let's say Table A has:
1, Bob
2, Tim
3, Pete
and Table C has:
1, Red
2, Blue
3, Green
Now Table B has
1, 1, 1
2, 1, 2
3, 3, 2
So Bob's favorite colors are Red and Blue, and Pete's only favorite color is Blue.
How do I query so that I only retrieve rows from Table A that have both Red and Blue as favorite colors? I don't want to see Pete in my result set.
You want to use the INTERSECT operator to get those that match both; note that this is SQL Server 2005+ only.
SELECT name FROM TableA
WHERE ID IN (SELECT Aid FROM TableB WHERE CId = 1
INTERSECT
SELECT Aid FROM TableB WHERE CId = 2)
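If INTERSECT isn't available to you, or you want to build the list of colors dynamically from the checkboxlist, a GROUP BY / HAVING count check gives the same result. A sketch using the table and column names above:
SELECT a.name
FROM TableA a
JOIN TableB b ON b.Aid = a.ID
WHERE b.CId IN (1, 2)               -- Red and Blue
GROUP BY a.ID, a.name
HAVING COUNT(DISTINCT b.CId) = 2    -- must match both colors
The HAVING count just has to equal the number of colors being filtered on, which makes it easy to generate from however many boxes are checked.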
I am trying to write a Power BI query that can calculate the number of rows matching a condition.
I have 5 tables: Table1, Table2, Table3, Table4, Table5.
In each of those tables I have two columns called ID and Date. I would like to count all IDs where the Date is not empty.
I am trying this query, but it is not helping my cause:
All Total Hires =
SUMX(
UNION(
SELECTCOLUMNS(Table1,"A",Table1[Name]),
SELECTCOLUMNS(Table2,"A",Table2[Name]),
SELECTCOLUMNS(Table3,"A",Table3[Name]),
SELECTCOLUMNS(Table4,"A",Table4[Name]),
SELECTCOLUMNS(Table5,"A",Table5[Name])
)
,IF([A] <> NULL, 1, 0))
Does anyone know a solution to this problem?
You can do something like this using COUNTROWS and BLANK(). Note: I've assumed that the Date is null/blank and not ' ' type of empty.
Table1 Non Blanks = CALCULATE(COUNTROWS('Table1'), FILTER('Table1', 'Table1'[Date] <> BLANK()))
You can create a measure per table and add them together, or combine them directly in one measure:
CALCULATE(COUNTROWS('Table1'), FILTER('Table1', 'Table1'[Date] <> BLANK()))
+ CALCULATE(COUNTROWS('Table2'), FILTER('Table2', 'Table2'[Date] <> BLANK()))
+ ... (and so on for Table3, Table4 and Table5)
You can also try the measure below:
count_id =
COUNTROWS(
UNION(
FILTER(Table_1, Table_1[date] <> BLANK()),
FILTER(Table_2, Table_2[date] <> BLANK())
)
)
I have some data that looks like this:
UserID Category
------ --------
1 a
1 b
2 c
3 b
3 a
3 c
I'd like to binary-encode this grouped by UserID: three different values exist in Category, so a binary encoding would be something like:
UserID encoding
------ --------
1 "1, 1, 0"
2 "0, 0, 1"
3 "1, 1, 1"
i.e., all three values are present for UserID = 3, so the corresponding vector is "1, 1, 1".
Is there a way to do this without writing a bunch of CASE WHEN statements? There may be dozens of possible values in Category.
Cross join the distinct users to the distinct categories, and left join the result to the table.
Then use the GROUP_CONCAT() window function, which supports an ORDER BY clause in its OVER() clause, to collect the 0s and 1s:
WITH
users AS (SELECT DISTINCT UserID FROM tablename),
categories AS (
SELECT DISTINCT Category, DENSE_RANK() OVER (ORDER BY Category) rn
FROM tablename
),
cte AS (
SELECT u.UserID, c.rn,
'"' || GROUP_CONCAT(t.UserID IS NOT NULL)
OVER (PARTITION BY u.UserID ORDER BY c.rn) || '"' encoding
FROM users u CROSS JOIN categories c
LEFT JOIN tablename t
ON t.UserID = u.UserID AND t.Category = c.Category
)
SELECT DISTINCT userID,
FIRST_VALUE(encoding) OVER (PARTITION BY UserID ORDER BY rn DESC) encoding
FROM cte
ORDER BY userID
This will work for any number of categories.
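For reference, the sample data from the question can be loaded like this (a sketch, assuming SQLite and a table named tablename):
CREATE TABLE tablename (UserID INTEGER, Category TEXT);
INSERT INTO tablename (UserID, Category) VALUES
  (1, 'a'), (1, 'b'),
  (2, 'c'),
  (3, 'b'), (3, 'a'), (3, 'c');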
Results:
UserID encoding
------ --------
1      "1,1,0"
2      "0,0,1"
3      "1,1,1"
First create an encoding table to explicitly establish the order of the categories in the bitmap:
create table e (Category text, Encoding int);
insert into e values ('a', 1), ('b', 2), ('c', 4);
Then generate a list of users u, (cross) joined with the encoding table e, to get a fully populated (UserId, Category, Encoding) table. Then left join that fully populated table with the user supplied data t. The right hand side t can now be used to decide whether a bit needs to be set or not:
select
u.UserId,
'"' ||
group_concat(case when t.UserId is null then 0 else 1 end, ', ')
|| '"' 'encoding'
from
(select distinct UserID from t) u
join e
left natural join t
group by 1
order by e.Encoding
and it gives the expected result:
1|"1, 1, 0"
2|"0, 0, 1"
3|"1, 1, 1"
I have 3 tables with this mock data
Item *id, name*
1, coke
2, fanta
3, juice
Branch *id, name*
1, store
2, warehouse
3, shop
BranchItem *item_id, branch_id, qty*
1, 1, 100
1, 2, 30
2, 2, 10
I want to query for an item (coke, for example) and get its quantity in all branches, even the ones it doesn't exist in; those should have NULL for the qty column.
So the result should look like:
1, coke, store, 100
1, coke, warehouse, 30
1, coke, shop, NULL
I have a query that can do this, but because of aliasing the joined tables, I lose the ability to refer to the columns of the result table. The parsing of the result is done in an ORM object which preferably shouldn't be rewritten.
The query I have:
Select *
from item
left join (
    select *
    from branch
    left join (select * from branchitem where item_id = 1) branchitem
        on branch.id = branchitem.branch_id
) JOINEDNAME on true
where item.id = 1;
My question is: I don't want to alias the join of branch and branchitem, as I lose the ability to refer to them separately in the ORM. How can this query be rewritten so the tables retain their names?
You don't need to use subqueries; joining the tables directly keeps their original names available to the ORM:
SELECT Item.id,
Item.name,
Branch.name,
BranchItem.qty
FROM Item
CROSS JOIN Branch
LEFT JOIN BranchItem ON Item.id = BranchItem.item_id
AND Branch.id = BranchItem.branch_id
WHERE Item.id = 1; -- or put it into the branch join
I got two tables:
emails
raw_id | email  | score
------ | ------ | -----
1      | email1 | 1
1      | email2 | 2
2      | email3 | 3
3      | email4 | 4
merged
raw_id1 | raw_id2
------- | -------
1       | 2
How can I make a query that will show me one row for each distinct raw_id with the highest score? Also, if two raw_ids are merged, they should be considered as the same one.
So for the above data, here is my expected result:
select score, email from emails join...
3, email3
4, email4
SELECT MAX(e1.raw_id, COALESCE(e2.raw_id, -1)),
MAX(e1.email),
MAX(MAX(e1.score, COALESCE(e2.score, -1)))
FROM emails e1 LEFT JOIN merged m
ON e1.raw_id = m.raw_id1
LEFT JOIN emails e2
ON e2.raw_id = m.raw_id2
GROUP BY MAX(e1.raw_id, COALESCE(e2.raw_id, -1))
Follow the link below for a running demo:
SQLFiddle
Note that this demo is for MySQL, because Fiddle doesn't offer the option for SQLite. The only change I had to make to the query was to replace SQLite's scalar MAX function with MySQL's GREATEST function.
SELECT max(score), email
FROM emails
LEFT JOIN
merged ON emails.raw_id = merged.raw_id1
GROUP BY coalesce(raw_id2, raw_id);
I have an application which has data spread across 2 tables.
There is a main table Main which has columns: Id, Name, Type.
Then there is a SubMain table that has columns: MainId (FK), StartDate, EndDate, City,
and this is a 1-to-many relation (each Main can have multiple entries in SubMain).
Now I want to display the columns Main.Id, City (comma-separated from the various SubMain rows for that Main item), the min of StartDate (from SubMain for that Main item) and the max of EndDate (from SubMain).
I thought of having a function, but that will slow things down since there will be 100k records. Is there some other way of doing this? By the way, the application is in ASP.NET. Can we have a SQL query or some LINQ kind of thing?
This is off the top of my head, but firstly I would suggest you create a user-defined function in SQL that builds the comma-separated city list string. It accepts @mainid, then does the following:
DECLARE @listStr VARCHAR(MAX)
SELECT @listStr = COALESCE(@listStr + ',', '') + city
FROM submain
WHERE mainid = @mainid
... and then returns @listStr, which will now be a comma-separated list of cities. Let's say you call your function MainIDCityStringGet().
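Wrapped up as a complete function, it might look something like this (a sketch; the submain table, its mainid and city columns, and the MainIDCityStringGet name are taken from the description above, so adjust to your actual schema):
CREATE FUNCTION dbo.MainIDCityStringGet (@mainid INT)
RETURNS VARCHAR(MAX)
AS
BEGIN
    DECLARE @listStr VARCHAR(MAX)

    -- Build the comma-separated city list for this main id
    SELECT @listStr = COALESCE(@listStr + ',', '') + city
    FROM submain
    WHERE mainid = @mainid

    RETURN @listStr
END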
Then for your final result you can simply execute the following
select cts.mainid,
cts.cities,
sts.minstartdate,
sts.maxenddate
from ( select distinct mainid,
dbo.MainIDCityStringGet(mainid) as 'cities'
from submain) as cts
join
( select mainid,
min(startdate) as 'minstartdate',
max(enddate) as 'maxenddate'
from submain
group by mainid ) as sts on sts.mainid = cts.mainid
where startdate <is what you want it to be>
and enddate <is what you want it to be>
Depending on how exactly you would like to filter by startdate and enddate, you may need to put the WHERE filter within each subquery, and in the second subquery of the join you may then need to use a HAVING filter on the grouped values. You did not clearly state the nature of your filter.
I hope that helps.
This will of course be in a stored procedure. It may need some debugging.
An alternative to creating a stored procedure is performing the complex operations on the client side (untested):
var result = (from main in context.Main
join sub in context.SubMain on main.Id equals sub.MainId into subs
let StartDate = subs.Min(s => s.StartDate)
let EndDate = subs.Max(s => s.EndDate)
let Cities = subs.Select(s => s.City).Distinct()
select new { main.Id, main.Name, main.Type, StartDate, EndDate, Cities })
.ToList()
.Select(x => new
{
x.Id,
x.Name,
x.Type,
x.StartDate,
x.EndDate,
Cities = string.Join(", ", x.Cities.ToArray())
})
.ToList();
I am unsure how well this is supported in other implementations of SQL, but if you have SQL Server this works like a charm for this type of scenario.
As a disclaimer I would like to add that I am not the originator of this technique. But I immediately thought of this question when I came across it.
Example:
For a table
Item ID Item Value Item Text
----------- ----------------- ---------------
1 2 A
1 2 B
1 6 C
2 2 D
2 4 A
3 7 B
3 1 D
Say you want the following output, with the strings concatenated and the values summed:
Item ID Item Value Item Text
----------- ----------------- ---------------
1 10 A, B, C
2 6 D, A
3 8 B, D
The following avoids a multi-statement looping solution:
if object_id('Items') is not null
drop table Items
go
create table Items
( ItemId int identity(1,1),
ItemNo int not null,
ItemValue int not null,
ItemDesc nvarchar(500) )
insert Items
( ItemNo,
ItemValue,
ItemDesc )
values ( 1, 2, 'A'),
( 1, 2, 'B'),
( 1, 6, 'C'),
( 2, 2, 'D'),
( 2, 4, 'A'),
( 3, 7, 'B'),
( 3, 1, 'D')
select it1.ItemNo,
sum(it1.ItemValue) as ItemValues,
stuff((select ', ' + it2.ItemDesc --// Stuff is just used to remove the first 2 characters, instead of a substring.
from Items it2 with (nolock)
where it1.ItemNo = it2.ItemNo
for xml path(''), type).value('.','varchar(max)'), 1, 2, '') as ItemDescs --// Does the actual concatenation..
from Items it1 with (nolock)
group by it1.ItemNo
So you see, all you need is a subquery in your SELECT that retrieves the set of values you need to concatenate, and then the FOR XML PATH trick in that subquery does the actual concatenation. It does not matter where the values you need to concatenate come from; you just need to retrieve them using the subquery.
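As a side note, if you happen to be on SQL Server 2017 or later (an assumption about your environment), STRING_AGG produces the same output without the FOR XML PATH trick:
select ItemNo,
       sum(ItemValue) as ItemValues,
       string_agg(ItemDesc, ', ') within group (order by ItemId) as ItemDescs
from Items
group by ItemNo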