Union of dict with typed keys not compatible with an empty dict - mypy

I'd like to type a dict as either a dictionary where all the keys are integers or a dictionary where all the keys are strings.
However, when I run mypy (v0.991) on the following code:
from typing import Union, Any
special_dict: Union[dict[int, Any], dict[str, Any]]
special_dict = {}
I get an Incompatible types error.
test_dict_nothing.py:6: error: Incompatible types in assignment (expression has type "Dict[<nothing>, <nothing>]", variable has type "Union[Dict[int, Any], Dict[str, Any]]") [assignment]
Found 1 error in 1 file (checked 1 source file)
How do I express my typing intent?

This is a mypy bug, already reported here with priority 1. There is a simple workaround suggested:
from typing import Any
special_dict: dict[str, Any] | dict[int, Any] = dict[str, Any]()
This code typechecks successfully because you give mypy a hint with a more specific type for your dictionary. Any matching member of the union will do, and it won't affect further checking, because the explicit annotation broadens the final type back to the union.
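At runtime both the failing and the working assignments build an ordinary empty dict; a minimal sketch of the workaround (the subscripted `dict[str, Any]()` call requires Python 3.9+):

```python
from typing import Any, Union

# Instantiating the subscripted alias hands mypy a concrete member of the
# union, so the empty literal no longer has type dict[<nothing>, <nothing>].
special_dict: Union[dict[str, Any], dict[int, Any]] = dict[str, Any]()
assert special_dict == {}

# The annotation still permits the other branch of the union later:
special_dict = {1: "one", 2: "two"}
```

The hint is purely for the type checker; `dict[str, Any]()` constructs the same empty dict `{}` would.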


why do I need parenthesis in this dictionary initializer, in F#

with these types:
type A =
    | AA
    | AB
type B =
    Dictionary<int, int>()
I initialize a dictionary:
Dictionary<A, B>(dict [ (A.AA, B()); (A.AB, B()) ])
but I do not understand why I need to put parentheses after B in the initialization code.
the following:
Dictionary<A, B>(dict [ (A.AA, B); (A.AB, B) ])
will not compile. I understand that 'B' may represent the type and 'B()' an instance of it, but I don't understand why the '()' would represent an instance?
As an additional question:
type B =
    Dictionary<int, int>()
and
type B =
    Dictionary<int, int>
both seem to work. Is any of the two preferred, and, if so, why?
First of all, the declaration type B = Dictionary<int, int>() does not work for me. I get an error "Unexpected symbol '(' in member definition", exactly as I would expect. Are you sure it's working for you? Which version of F# are you using?
The type Dictionary<_,_> is a class. Classes are not the same as discriminated unions (which the type A is). They are a different sort of type.
In particular, to create a value of a class type, one needs to call a constructor and pass it some parameters. This is exactly what you're doing in your very own code:
Dictionary<A, B> (dict [ (A.AA, B()); (A.AB, B()) ])
^--------------^ ^---------------------------------^
       |                          |
  constructor                     |
                                  |
               parameter passed to the constructor
Some classes have multiple constructors; Dictionary is one such type.
Some constructors have no parameters, but you still have to call them. This is what you do with empty parens.
F# models parameterless .NET methods and constructors as functions that have a single parameter, and that parameter is of type unit. This is what you're doing when you say B()
B ()
^ ^^
|  |
|  single parameter of type unit
|
constructor
If you just say B without a parameter, then what you get is a function of type unit -> B: a function that expects a single parameter of type unit and, when you pass it such a parameter, returns a value of type B.
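The class-versus-instance distinction behind B vs B() exists in many languages; a rough Python analogy (the class B here is just an illustrative stand-in, not the F# type):

```python
class B:
    """Stand-in for the F# class type B."""

constructor = B    # the bare name is the constructor itself (a callable)
instance = B()     # calling it yields a value of type B

assert callable(constructor)
assert isinstance(instance, B)
```

In both languages, naming the class gives you something you can call; only the call itself produces an instance.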

how to solve InterfaceError when importing a variable file which has a depth-1 variable being a list?

I am importing a variable file (eg, variables.json) into airflow, which has one depth-1 variable being a list like this:
{...
"var1": ["value1", "value2"],
...
}
I tried 3 methods:
1). in command line: airflow variables -i variables.json
2). in airflow UI, admin -> Variables -> Choose file -> Import Variables
3). in airflow UI, admin -> Variables -> Create -> input key (ie, Var1) and value (ie, ["value1", "value2"]) respectively.
Method 1 and 2 failed, but 3 succeeded.
Method 1 returns info like "15 of 27 variables successfully updated.", which means some variables were not successfully updated.
method 2 shows error:
InterfaceError: (sqlite3.InterfaceError) Error binding parameter 1 - probably unsupported type. [SQL: u'INSERT INTO variable ("key", val, is_encrypted) VALUES (?, ?, ?)'] [parameters: (u'var1', [u'value1', u'value2'], 0)] (Background on this error at: http://sqlalche.me/e/rvf5)
I searched and found this thread: InterfaceError: (sqlite3.InterfaceError) Error binding parameter 0.
It seems that sqlite does not support the list type.
I also tested a case with a nested variable (here for example, var2_1) whose value is a list, like this:
{...
"var2": {"var2_1": ["A","B"]},
...
}
All of the above 3 methods work.
So my questions are:
(1) Why did methods 1 and 2 fail, but 3 succeed, for a depth-1 variable that is a list?
(2) Why can a nested (depth-2,3,...) variable be a list without any issue?
If you're running Airflow 1.10.3, the import_helper used in the CLI only serializes dict values to JSON.
def import_helper(filepath):
    # ...
    try:
        n = 0
        for k, v in d.items():
            if isinstance(v, dict):
                Variable.set(k, v, serialize_json=True)
            else:
                Variable.set(k, v)
            n += 1
    except Exception:
        pass
    finally:
        print("{} of {} variables successfully updated.".format(n, len(d)))
https://github.com/apache/airflow/blob/1.10.3/airflow/bin/cli.py#L376
The WebUI importer also does the same thing with dict values.
models.Variable.set(k, v, serialize_json=isinstance(v, dict))
https://github.com/apache/airflow/blob/1.10.3/airflow/www/views.py#L2073
However, the current revision (1.10.4rc1) shows that non-string values will be serialized to JSON in future releases, in the CLI import_helper:
Variable.set(k, v, serialize_json=not isinstance(v, six.string_types))
https://github.com/apache/airflow/blob/1.10.4rc1/airflow/bin/cli.py
...and in the WebUI importer:
models.Variable.set(k, v, serialize_json=not isinstance(v, six.string_types))
https://github.com/apache/airflow/blob/1.10.4rc1/airflow/www/views.py#L2118
For now, it will serve you to serialize non-string values yourself in your import process when you use the CLI or WebUI importer,
...and to pass the option to deserialize them when you retrieve such a variable, e.g.
Variable.get('some-key', deserialize_json=True)
In your variables.json, ["value1", "value2"] is an array, whereas Airflow expects a plain string or a JSON value serialized to a string.
It would work if you cast that array into a string in your JSON.
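Until you upgrade, you can do that cast in a small preprocessing step; a hypothetical helper (the function and file names are assumptions, not part of Airflow):

```python
import json

def stringify_depth1_values(src_path, dst_path):
    """Rewrite a variables file so every non-string top-level value
    (list, number, ...) becomes a JSON string, which the 1.10.3
    CLI and WebUI importers can store without error."""
    with open(src_path) as f:
        variables = json.load(f)
    fixed = {
        k: v if isinstance(v, str) else json.dumps(v)
        for k, v in variables.items()
    }
    with open(dst_path, "w") as f:
        json.dump(fixed, f)
```

After importing the rewritten file, read the list back with Variable.get('var1', deserialize_json=True).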

How to get the correct signatures order of annotations in methods when performing overriding

I am trying to fix some methods annotations on magic and normal methods. For example, I have some cases like:
```
import typing

class Headers(typing.Mapping[str, str]):
    ...
    def __contains__(self, key: str) -> bool:
        ...
        return False

    def keys(self) -> typing.List[str]:
        ...
        return ['a', 'b']
```
and when I run mypy somefile.py --disallow-untyped-defs I have the following errors:
error: Argument 1 of "__contains__" incompatible with supertype "Mapping"
error: Argument 1 of "__contains__" incompatible with supertype "Container"
error: Return type of "keys" incompatible with supertype "Mapping"
What I understand is that I need to override the methods using the @override decorator and I need to respect the order of inheritance. Is that correct?
If my assumption is correct, is there any place where I can find the exact signatures of the parent classes?
After asking the question on mypy, the answer was:
Subclassing typing.Mapping[str, str], I'd assume that the function
signature for the argument key in __contains__ ought to match the
generic type?
__contains__ isn't a generic method -- it's defined to have the type signature __contains__(self, key: object) -> bool. You can check this on typeshed. The reason why __contains__ is defined this way is because doing things like 1 in {"foo": "bar"} is technically legal.
Subclassing def __contains__(self, key) to def __contains__(self, key:
str) is in any case more specific. A more specific subtype doesn't
violate Liskov, no?
When you're overriding a function, it's ok to make the argument types more general and the return types more specific. That is, the argument types should be contravariant and the return types covariant.
If we did not follow the rule, we could end up introducing bugs in our code. For example:
class Parent:
    def foo(self, x: object) -> None: ...

class Child(Parent):
    def foo(self, x: str) -> None: ...

def test(x: Parent) -> None:
    x.foo(300)  # Safe if 'x' is actually a Parent, not safe if 'x' is actually a Child.

test(Child())
Because we broke Liskov substitutability, passing an instance of Child into test ended up introducing a bug.
Basically, if I use Any for key on the __contains__ method, it is correct and mypy won't complain:
def __contains__(self, key: typing.Any) -> bool:
    ...
    return False
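Putting the two fixes together, a sketch of a Headers class that passes mypy --disallow-untyped-defs (the backing dict and the method bodies are assumptions; the original only showed stubs):

```python
import typing

class Headers(typing.Mapping[str, str]):
    def __init__(self, data: typing.Dict[str, str]) -> None:
        self._data = data

    # 'object' matches the typeshed signature of Mapping.__contains__
    def __contains__(self, key: object) -> bool:
        return key in self._data

    def __getitem__(self, key: str) -> str:
        return self._data[key]

    def __iter__(self) -> typing.Iterator[str]:
        return iter(self._data)

    def __len__(self) -> int:
        return len(self._data)

    # KeysView[str] matches the supertype's return type for keys()
    def keys(self) -> typing.KeysView[str]:
        return self._data.keys()
```

Widening key to object also keeps checks like 1 in headers legal at the type level, exactly as with a plain dict.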
You can follow the conversation here

Dictionary in constructor for a mutable struct in Julia

Is it possible to initialize a mutable struct with a field which is a dict? I am trying the following:
mutable struct Global
    speciesCnt::Int64
    chromosomeCnt::Int64
    nodeCnt::Int64
    innov_number::Int64
    innovations::Dict{(Int64,Int64),Int64}
    cf::Config
    function Global(cf::Config)
        new(0, 0, 0, 0, Dict{(Int64,Int64),Int64}(), cf)  # global dictionary
    end
end
however, when I run it I get the following error:
LoadError: TypeError: in Type, in parameter, expected Type, got Tuple{DataType,DataType}.
Any help is greatly appreciated.
I am using Julia v 1.0
A proper type signature for your dict is:
Dict{Tuple{Int64,Int64},Int64}
The easiest way to learn what type signatures in Julia look like is to create an object of the desired type and use the typeof function to display its type:
julia> typeof(Dict((1,2)=>3))
Dict{Tuple{Int64,Int64},Int64}

Why is the return type of Deref::deref itself a reference?

I was reading the docs for Rust's Deref trait:
pub trait Deref {
    type Target: ?Sized;
    fn deref(&self) -> &Self::Target;
}
The type signature for the deref function seems counter-intuitive to me; why is the return type a reference? If references implement this trait so they can be dereferenced, what effect would this have at all?
The only explanation that I can come up with is that references don't implement Deref, but are considered "primitively dereferenceable". However, how would a polymorphic function which would work for any dereferenceable type, including both Deref<T> and &T, be written then?
that references don't implement Deref
You can see all the types that implement Deref, and &T is in that list:
impl<'a, T> Deref for &'a T where T: ?Sized
The non-obvious thing is that there is syntactical sugar being applied when you use the * operator with something that implements Deref. Check out this small example:
use std::ops::Deref;

fn main() {
    let s: String = "hello".into();
    let _: () = Deref::deref(&s);
    let _: () = *s;
}
error[E0308]: mismatched types
 --> src/main.rs:5:17
  |
5 |     let _: () = Deref::deref(&s);
  |                 ^^^^^^^^^^^^^^^^ expected (), found &str
  |
  = note: expected type `()`
             found type `&str`

error[E0308]: mismatched types
 --> src/main.rs:6:17
  |
6 |     let _: () = *s;
  |                 ^^ expected (), found str
  |
  = note: expected type `()`
             found type `str`
The explicit call to deref returns a &str, but the operator * returns a str. It's more like you are calling *Deref::deref(&s), ignoring the implied infinite recursion (see docs).
Xirdus is correct in saying
If deref returned a value, it would either be useless because it would always move out, or have semantics that drastically differ from every other function
Although "useless" is a bit strong; it would still be useful for types that implement Copy.
See also:
Why does asserting on the result of Deref::deref fail with a type mismatch?
Note that all of the above is effectively true for Index and IndexMut as well.
The compiler knows only how to dereference &-pointers - but it also knows that types that implement Deref trait have a deref() method that can be used to get an appropriate reference to something inside given object. If you dereference an object, what you actually do is first obtain the reference and only then dereference it.
If deref() returned a value, it would either be useless because it would always move out, or have semantics that drastically differ from every other function, which would not be nice.
