Cannot convert value of type 'NSAttributedString.DocumentAttributeKey'

I have used the following code multiple times in the past and in one application it still compiles just fine:
let attrStr = try! NSAttributedString(data: myLabel.data(using: String.Encoding.unicode,allowLossyConversion: true)!, options: [ NSDocumentTypeDocumentAttribute: NSHTMLTextDocumentType], documentAttributes: nil)
Now, I've started a new app and I get the following error:
Cannot convert value of type 'NSAttributedString.DocumentAttributeKey' to expected dictionary key type 'NSAttributedString.DocumentReadingOptionKey'
I can't seem to find anything applicable to fixing this. Has something changed?

Not able to access certain JSON properties in Autoloader

I have a JSON file that is loaded by two different Autoloaders.
One uses schema evolution and, besides replacing spaces in the JSON property names, writes the JSON directly to a Delta table, and I can see that all the values are there properly.
In the second one I am mapping to a defined schema and only use a subset of properties, so I use a lot of withColumn calls and then a select to narrow down to my defined column list.
Autoloader definition:
df = (spark
.readStream
.format('cloudFiles')
.option('cloudFiles.format', 'json')
.option('multiLine', 'true')
.option('cloudFiles.schemaEvolutionMode','rescue')
.option('cloudFiles.includeExistingFiles','true')
.option('cloudFiles.schemaLocation', bronze_schema)
.option('cloudFiles.inferColumnTypes', 'true')
.option('pathGlobFilter','*.json')
.load(upload_path)
.transform(lambda df: remove_spaces_from_columns(df))
.withColumn(...
Writer:
df.writeStream.format('delta') \
.queryName(al_stream_name) \
.outputMode('append') \
.option('checkpointLocation', checkpoint_path) \
.option('mergeSchema', 'true') \
.trigger(once = True) \
.table(bronze_table)
The issue is that some of the source columns load fine and I get their values, while others are constantly null in the output table.
For example:
.withColumn('vl_rating', col('risk_severity.value')) # works
.withColumn('status', col('status.name')) # always null
...
.select(
'rating',
'status',
...
The JSON is quite simple: these are all string values and they are always populated. The same code works against a similar JSON file in another Autoloader without issue.
I have run out of ideas for fault-finding on this. My imports are minimal, and outside of Autoloader the JSON loads fine, e.g.:
%python
import pyspark.sql.functions as psf
jsontest = spark.read.option('inferSchema','true').json('dbfs:....json')
df = jsontest.withColumn('status', psf.col('status.name')).select('status')
display(df)
This results in the values of the status.name property of the JSON file.
Any ideas would be greatly appreciated.
I have found generally what is causing this. Interesting cause!
I am scanning a whole directory of JSON files, and the schema evolves over time (as expected). But when I clear out the Autoloader schema and checkpoint directories and only scan the latest JSON file, it all works correctly.
So what I surmise is that something in the schema evolution over the older JSON files causes Autoloader to get into a state where it will not put certain properties into the stream to the writer.
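For reference, the reset looked roughly like this (a sketch only, run in a Databricks notebook; it wipes the inferred schema history and the stream checkpoint, so the existing files are reprocessed on the next run):
# Remove the Autoloader schema history and the stream checkpoint so the
# schema is re-inferred from scratch; only do this if reprocessing the
# existing JSON files is acceptable.
dbutils.fs.rm(bronze_schema, True)     # cloudFiles.schemaLocation
dbutils.fs.rm(checkpoint_path, True)   # writeStream checkpointLocation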
If anyone has any recommendations on how to implement some data quality analysis in an Autoloader pipeline, I would be most appreciative if you would share them.
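In the meantime, something like the sketch below is what I have been considering (purely a sketch on my part: audit_checkpoint_path is a hypothetical location, and it assumes _rescued_data and the problem columns are kept in the select list):
import pyspark.sql.functions as psf

def audit_batch(batch_df, batch_id):
    # Rows where rescue mode could not fit the data into the inferred schema
    rescued = batch_df.filter(psf.col('_rescued_data').isNotNull()).count()
    # Null count for the column that keeps coming out empty
    null_status = batch_df.filter(psf.col('status').isNull()).count()
    print(f'batch {batch_id}: rescued={rescued}, null status={null_status}')

(df.writeStream
   .foreachBatch(audit_batch)
   .option('checkpointLocation', audit_checkpoint_path)
   .trigger(once=True)
   .start())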

Xcos throws "Undefined variable: scifunc_block_m" message in console

When I run an Xcos model containing a scifunc_block_m block, as shown below, I get an error message about a data dimensions inconsistency:
"Data dimensions are inconsistent:"
" Variable size=[1,1]"
"Block output size=[100,1]."
But when I double-click on the block to see what I can change to make the dimensions correct, I get a message in the console saying
Undefined variable: scifunc_block_m
What bugs me is that scifunc_block_m is not the name of any variable, but rather the name of the block itself, as can be seen in the official docs.
Of course, I double-checked that neither in my function phase_shifter nor anywhere else do I have any variable with that name.
I tried with Scilab 6.1.1 and 6.1.0, believing that it might be a bug, but apparently it is not.
In your phase_shifter.sce file generating the input variable, the signalIn variable does not comply with the From Workspace block requirements. Its documentation says that the input variable
must be a structure with time and values fields
where .time must be a column vector and, in your case, .values must also be a column.
So,
t = (0:1/fs:Npp/fs - 1/fs); // time vector
signalIn = A*%e^(%i*w*t);
should be replaced with
t = (0:1/fs:Npp/fs - 1/fs)'; // time column vector
signalIn = struct("time",t, "values",A*%e^(%i*w*t));
This fixes the inconsistent dimensions message.
In addition, I am not able to reproduce your issue with Undefined variable: scifunc_block_m. The parameters interface opens as expected.
You may get this kind of message if you try to run some Xcos parts outside of Xcos, without first loading the Xcos-related libraries.
Then we get an unclear "Output should be of complex type." message on the From Workspace block.
By the way, you are trying to plot some complex values. Please have a look at the MATMAGPHI block before entering the MUX: https://help.scilab.org/docs/6.1.1/en_US/MATMAGPHI.html

Reticulate AWS Cognito

This is my Python code (that I've checked and it works):
import boto3
from warrant.aws_srp import AWSSRP

# region_name, POOL_ID and CLIENT_ID are defined elsewhere in the script
def auth(USERNAME, PASSWORD):
    client = boto3.client('cognito-idp', region_name=region_name)
    aws = AWSSRP(username=USERNAME, password=PASSWORD, pool_id=POOL_ID,
                 client_id=CLIENT_ID, client=client)
    try:
        tokens = aws.authenticate_user()
        return tokens
    except Exception as e:
        return e
I'm working with R to create a visual interface for doing some operations (including this one), and it is a requirement.
I use the reticulate R package to execute Python code. I tested it with some dummy code to check that it works correctly (and it is okay).
When I execute the above function by running:
reticulate::source_python(FILE_PATH)
py$auth(USERNAME,PASSWORD)
I get the following error:
An error occurred (InvalidParameterException) when calling the RespondToAuthChallenge operation: TIMESTAMP format should be EEE MMM d HH:mm:ss z yyyy in english.
I searched a lot but found nothing. I suppose there may be some sort of wrapper or formatter. Maybe someone has already faced this problem...
Thanks a lot for any help.
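One thing I have not tried yet (only a guess based on the wording of the error, not a confirmed fix) is forcing an English time locale before authenticating, since the SRP challenge timestamp seems to be formatted with the current locale:
import locale

# Guess: render the challenge timestamp with English day/month names, as the
# error message demands, instead of whatever locale the R session runs under.
locale.setlocale(locale.LC_TIME, 'C')

tokens = auth(USERNAME, PASSWORD)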

Anylogic - Error while drawing animation frame

I have a model and everything was working fine. I was just changing some values to try different scenarios and adding some graphs when I got the error message:
Error while drawing animation frame. possibly created by dynamic properties of animation shapes
The only message I could get from the console window is:
Error during model creation:
java.base/java.lang.String cannot be cast to java.base/java.lang.Double
What could be the cause of this error?
Somewhere in your code, you might have attempted to cast a String to a double:
String Message = "1.20";
double res = (double) Message;
The above cast is not allowed in Java, and a situation like this might have occurred somewhere in your model.
Likewise, you are not allowed to assign a string to a double variable:
double res = "1.2";
(To actually convert such a string, you would use Double.parseDouble(Message) instead.)

Compiler confused about mock object

I'm using OCMock to aid in the test-driven development I am doing for an iPad application in Xcode. I have test code like this:
id mock = [OCMockObject mockForProtocol:@protocol(SomeProtocol)];
Vector direction = { 1.0f, 2.0f, 3.0f };
[[mock expect] setDirection:direction];
When I try to compile, I'm getting warnings and errors like this:
warning: multiple methods named 'setDirection:' found
error: sending 'Vector' to parameter of incompatible type 'UISwipeGestureRecognizerDirection' (aka 'enum UISwipeGestureRecognizerDirection')
Obviously the compiler is not able to determine what type of object the mock is supposed to be. I'm not sure how to specify that it should deal with the setDirection method from the SomeProtocol protocol instead of a setDirection method from another class.
What can be done to make a test case like this build successfully?
Qualifying the mock with a cast will eliminate the ambiguity:
[(id<SomeProtocol>)[mock expect] setDirection:direction];
For the OCMock 3 modern syntax:
id protocolMock = OCMProtocolMock(@protocol(MYProtocol));
OCMExpect([(id <MYProtocol>)protocolMock ambiguousMethod]);
