Error while fetching hash value for a blockchain - Python 3.6

I am working on a sample blockchain implementation with dummy data. I have one method, createGenesisBlock(), for creating the first block, and another method, makeBlock(), for building the chain. The code is written in Python 3.6.
I am facing an issue in makeBlock() when retrieving the hash value that was created in the genesis block and stored in a text file.
Below is the error message:
parentHash=parentBlock[u'hash']
TypeError: string indices must be integers
The code snippet is below:
def makeBlock(self, txns, parentBlock):
    parentBlock = parentBlock.strip(parentBlock[0])
    parentblocklen = len(parentBlock)
    parentBlock = parentBlock.strip(parentBlock[parentblocklen - 1])
    parentHash = parentBlock[u'hash']
    blockNumber = parentBlock[u'contents'][u'blockNumber'] + 1
    blockContents = {u'blockNumber': blockNumber, u'parentHash': parentHash,
                     u'RCount': len(txns), u'records': txns}
    blockHash = hashMe(blockContents)
    block = {u'hash': blockHash, u'contents': blockContents}
    return block
The method makeBlock() above takes the string below as its second argument, but fetching the hash from that string results in the error.
************string starts*********
{'hash': 'af12f6148f5d328d1272e81393f199a6', 'contents': {'blockNumber': 0, 'parentHash': None, 'RCount': 2, 'records': [{'Patient ID': (b'\xa6\xe0<"\x18\x18F~-.g\x1cX\xf3\r\x86\xc0\x04hUG\xa7PT\xa2\x85\'\x06S?\xab3\xf4T$I\xcf\x94\xcc \xc8\x1b\xfb\xfe\nTw!\x80q\x1ae1\x99\x9e\xbf\xbc^D\xea\xe9X\x06\x8f\x8es\xb3\x02sAdf07\xdc\xdd\xf0\x17M\xf5\xeep\xbd\xeb\x88V\xd6\x86\xd4\9e\xcf\x93\x06p\xc4\x1a\xe8\r\xc5\xa2\xd6j\xc4\xff\xc7\xfb\x81\xfaa_\xe0C\x97#f3Zu7\xb2\x07YO\xe3',), 'Insurance ID': (b'\xaa\x1bJ\x84z\xe9\r\x8e\`IC\x9f\xf9\x03\xf4\xf5/\x1ceXW\x01\xf3\x03\r~yTi\xff:\xa3\x05\xcf\x84xl\xf9r\x95C\xc1f\xe6\xec\xbc\tz\xa9\xd0J\xe6#\xe6\xc5\xf8\xbb\xb1\x05flY\xc6\xa3\xfdy+\xa2Eq\xbb\xebe\x8dm?404T\xe9\xb2\x85_\x89\xc4&\xa2<\xac\xbf\xc2\xd8z\xee\x9e\xd2\xfc\xf8\xa2<.(i\x07\x10o\x9f\xf9\xa9\x1bH\xda\xd4?\xd9\x7f\xb4G\xeb\x15\x13\xbaW\x9b\r\xc3',), 'DOS': "'19-09-2216'", 'Claim Amount': "'2280.00'", 'Claim ID': "'9348'", 'Provider ID': (b'q\x8a!\xfce\xb7\x8e\xeax\x8a\xc9h\xd4\\x02P\xb5j\xdc\xb2\xe5\xa4\xc6#\xbd\xda\x1e,;)\x83\xe7\x98\x15\xe2\xe7\xe4c\xd5\x9e\x99k\x8a\x97\xdd#\x803\xc0\xbbY3\xa7\xfe\xeb&\x0f\xa3/\x1e\x96\xdf}\x83\x00euX\xd4J\r\xd1\xd91\xb3\x89\xc9I\x94L\xf2U3L=\xb2\xa9+43\x95t^\x9a\x85\x02\xea\x16V\xac\xad\xbc"=.i\xab8?7\x1di\x83\x18\xee\x9aM\xae\x84\xf8p\x9ae\xf9\xb0\x82',), 'URL': "'www.yahoo.com'"}, {'Patient ID': (b'\xbf\xec\rk\xaa\x99\xde\t\xfe\xf7R\x0fu6\x0fz\x9e\x01\xa6\xbb\x80\xfeyh\xe7\n\xeb\x03H\x01/ \xfc\xda\x8dH\xc9\x12R\xe1.E\xc3\xb1,\x0b)\x7f\xca\xc73"\xa8\x0c{\xe4Q\x95pi\x97\xfaN\xc3^\xd3\xf1\x0buK\xe3c\x06\xd1\xfe\xa6\xdd\xafO\xfe#\x17\xe9\xe6\xd6\xe1\xb3s#{\x9fV\xba\x13\xae\xa7q\x8b\x08\x9b\xb1\xa0\x0eA\xf3M\x96\xf6\xbbhs[\xbd/Z:\x87Y\xe8\xb9\xe0\xcb\xbb\xfb\x0b\x1d#\xb0',), 'Insurance ID': (b'\x06\x11\xaa\x08\xde9\xbcd\x91\xe28\xd6J[\xf6\x1a\xbb{\xc9f\x8aa\xdd\x86\xbf\x17\xdf\xaaQiZq\x91\xbb\xf1F\x1f[n0i\xc1\x96\xf2\x9f\xf1.\xde\x11P\xb7\xcd\x0e\xab\x97\xd3W\xa6\xd0\xc0\xc7\xbe\xb7\x8a\xd2\xe9\x84\x9bM\xeb\x0e\x8f\x91\t\xd9\xecq\xad\xd8w\r\xc4\x08\x0f\xa5\x99\xc3)\xa7\x8b\xa0\xa5\xc6\xbf\xce\xfa\xe6\x9f9\x81)\xfevJ\xad\x1d\xc9%\xe0\xe1(\xd5\xd7\x19g\x1f\xe1{\x1c\x1e%\x10 \x12\xc6\x03\xce',), 'DOS': "'10-11-2012'", 'Claim Amount': "'1230.00'", 'Claim ID': "'98009'", 'Provider ID': (b'\x81/O\xddKPk6XD\x82g\x97\x80\x87\x92\xc5\xfa\x07v \xf9\xe43\xd9\x00z\xf8\xe9\x1e3"\xa1\xeb\x039\xda\t:K\r\xda\xbcx\x84\x95\xdf\xa2\xe9\xdcU\xd9\C41\xb7\xf0\x07\xf5euy\xd3Ab\x10teE\xf2\xf2\x10\xc0\x1c\x16$V7\xf1\x8d\x16\x8a\x14\x02\x16\xff\xe3\xad\xa3#`\x13\xd0\xef\x18\x83\x13D\x91\x05>\x13\x8aY\xdb\xfeN\x99\xef\x9d>\xc9\xf2\xfbMj\xd1\x00E\xd5\xdaYW\x1d\x0e\xca\xa0',), 'URL': "'www.oracle.com'"}]}}*
************string ends *********
NOTE: Surprisingly, executing the same code in the Python shell, I am able to fetch the parentHash value into a variable.
parentHash=parentBlock[u'hash'] (executed in the shell, this returns the hash value successfully: af12f6148f5d328d1272e81393f199a6)
parentHash=parentBlock[u'hash'] (executed via the Python script, this results in the error TypeError: string indices must be integers)
Could someone please help me find where the issue is, and why the same code works fine in the Python shell but throws an error inside the method?
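For context on the likely cause: the block read back from the text file is still a plain string, and stripping its first and last characters does not turn it into a dict, so parentBlock[u'hash'] indexes a str (hence the TypeError). In the shell it works because parentBlock there is the original dict, not the file's contents. Below is a minimal sketch of one way to parse the stored string back into a dict, assuming the file holds the dict's repr(); the file name is hypothetical, and ast.literal_eval is used because the dump contains bytes and tuple literals that json.loads cannot handle:

import ast

def loadParentBlock(path):
    # Read back the text that was written to the file
    with open(path) as f:
        raw = f.read().strip()
    # Rebuild the dict from its repr(); literal_eval safely evaluates
    # the bytes literals and tuples embedded in the records
    return ast.literal_eval(raw)

parentBlock = loadParentBlock('genesisBlock.txt')  # hypothetical file name
print(parentBlock[u'hash'])  # af12f6148f5d328d1272e81393f199a6

With the block parsed this way, makeBlock() can drop the strip() calls and index parentBlock directly.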

Related

Airflow - Druid Operator is not getting host

I have a question about the Druid Operator. I see that the connection test is successful, but I get this error:
File "/home/airflow/.local/lib/python3.7/site-packages/requests/sessions.py", line 792, in get_adapter
raise InvalidSchema(f"No connection adapters were found for {url!r}")
My DAG contains a task like this:
DRUID_CONN_ID = "druid_ingest_conn_id"
ingestion = DruidOperator(
    task_id='ingestion',
    druid_ingest_conn_id=DRUID_CONN_ID,
    json_index_file='ingestion.json'
)
I also tried changing the DAG, but I get the same error.
As another step, I changed the task to a SimpleHttpOperator like this, but then I get a different error:
ingestion_2 = SimpleHttpOperator(
    task_id='test_task',
    method='POST',
    http_conn_id=DRUID_CONN_ID,
    endpoint='/druid/indexer/v1/task',
    data=json.dumps(read_file),
    dag=dag,
    do_xcom_push=True,
    headers={
        'Content-Type': 'application/json'
    },
    response_check=lambda response: response.json()['Status'] == 200,
)
{"error":"Missing type id when trying to resolve subtype of [simple type, class org.apache.druid.indexing.common.task.Task]: missing type id property 'type'\n at [Source: (org.eclipse.jetty.server.HttpInputOverHTTP); line: 1, column: 1]"}
Finally, I tried giving an HTTP connection to the Druid Operator, but then I get an error like this:
raise AirflowException(f'Did not get 200 when submitting the Druid job to {url}')
So I am confused and need some help. Thanks for any answers.
P.S.: We use Airflow version 2.3.3.
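For reference, the "No connection adapters were found" message from requests usually means the URL built from the Airflow connection has no http:// or https:// scheme (for example, a connection of type druid can produce a druid://... URL, which requests cannot handle). Below is a hedged sketch of defining the connection through an environment variable so the hook builds an http:// URL; the host, port, and endpoint values are placeholders, not taken from the question:

# Hypothetical connection definition; adjust host/port/endpoint to your Overlord.
# The URI scheme becomes the connection type, and the ingestion endpoint is
# passed as a connection extra.
import os
os.environ["AIRFLOW_CONN_DRUID_INGEST_CONN_ID"] = (
    "http://druid-overlord.example.com:8081/?endpoint=druid%2Findexer%2Fv1%2Ftask"
)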

Pass pyfiles and arguments to DataProcPySparkOperator

I am trying to pass arguments and zipped pyfiles to a temporary Dataproc cluster in Composer:
spark_args = {
    'conn_id': 'spark_default',
    'num_executors': 2,
    'executor_cores': 2,
    'executor_memory': '2G',
    'driver_memory': '2G',
}
task = dataproc_operator.DataProcPySparkOperator(
    task_id='spark_preprocess_{}'.format(name),
    project_id=PROJECT_ID,
    cluster_name=CLUSTER_NAME,
    region='europe-west4',
    main='gs://my-bucket/dist/main.py',
    pyfiles='gs://my-bucket/dist/jobs.zip',
    dataproc_pyspark_properties=spark_args,
    arguments=['--name', 'test', '--date', self.date_exec],
    dag=subdag
)
But I get the following error. Any idea how to correctly format the arguments?
Invalid value at 'job.pyspark_job.properties[1].value' (TYPE_STRING)
As pointed out in the comment, the issue is that spark_args has non-string values, but per the error message it should contain only strings:
Invalid value at 'job.pyspark_job.properties[1].value' (TYPE_STRING)
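A minimal fix sketch following that comment: quote the numeric values so every property value is a string:

spark_args = {
    'conn_id': 'spark_default',
    'num_executors': '2',   # was 2 (int); the API expects TYPE_STRING
    'executor_cores': '2',  # was 2 (int)
    'executor_memory': '2G',
    'driver_memory': '2G',
}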

Resource 7bed8adc-9ed9-49dc-b15e-6660e2fc3285 transitioned to failure state ERROR when use openstacksdk to create_server

When I create the OpenStack server, I get the exception below:
Resource 7bed8adc-9ed9-49dc-b15e-6660e2fc3285 transitioned to failure state ERROR
My code is below:
server_args = {
    "name": server_name,
    "image_id": image_id,
    "flavor_id": flavor_id,
    "networks": [{"uuid": network.id}],
    "admin_password": admin_password,
}
try:
    server = user_conn.conn.compute.create_server(**server_args)
    server = user_conn.conn.compute.wait_for_server(server)
except Exception as e:  # this is where I catch the exception
    raise e
When calling create_server, my server_args data is below:
{'flavor_id': 'd4424892-4165-494e-bedc-71dc97a73202', 'networks': [{'uuid': 'da4e3433-2b21-42bb-befa-6e1e26808a99'}], 'admin_password': '123456', 'name': '133456', 'image_id': '60f4005e-5daf-4aef-a018-4c6b2ff06b40'}
My openstacksdk version is 0.9.18.
In the end, I found that the flavor was too large for the OpenStack compute node; after switching to a smaller flavor, the server was created successfully.
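For anyone hitting the same failure, below is a hedged sketch of inspecting the available flavors with the same SDK connection before creating the server, so you can pick one the compute node can actually host (attribute names follow openstacksdk's compute Flavor resource):

# List each flavor's size before choosing a flavor_id
for flavor in user_conn.conn.compute.flavors():
    print(flavor.id, flavor.name, flavor.vcpus, flavor.ram, flavor.disk)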

erlang-sqlite3 sqlite3:table_info error

I tried to extract table info from an SQLite database using the sqlite3:table_info/2 function and got an error message.
{ok,Pid} = sqlite3:open(db3).
Sql = <<"CREATE TABLE test (
    id INTEGER PRIMARY KEY,
    ts TEXT default (datetime('now')),
    key TEXT,
    val TEXT
);">>.
sqlite3:sql_exec(db3,Sql).
Check table list:
sqlite3:list_tables(db3).
[test]
Try to get table info:
sqlite3:table_info(db3,test).
And now error message:
=ERROR REPORT==== 1-Mar-2015::19:37:46 ===
** Generic server db3 terminating
** Last message in was {table_info,test}
** When Server state == {state,#Port,
[{file,"../tmp/db3.sqlite"}],
{dict,0,16,16,8,80,48,
{[],[],[],[],[],[],[],[],[],[],[],[],[],
[],[],[]},
{{[],[],[],[],[],[],[],[],[],[],[],[],[],
[],[],[]}}}}
** Reason for termination ==
** {function_clause,[{sqlite3,build_constraints,
[["INTEGER","PRIMARY","KEY"]],
[{file,"src/sqlite3.erl"},{line,1169}]},
{sqlite3,build_table_info,2,
[{file,"src/sqlite3.erl"},{line,1166}]},
{sqlite3,handle_call,3,
[{file,"src/sqlite3.erl"},{line,833}]},
{gen_server,try_handle_call,4,
[{file,"gen_server.erl"},{line,607}]},
{gen_server,handle_msg,5,
[{file,"gen_server.erl"},{line,639}]},
{proc_lib,init_p_do_apply,3,
[{file,"proc_lib.erl"},{line,237}]}]}
** exception exit: function_clause
in function sqlite3:build_constraints/1
called as sqlite3:build_constraints(["INTEGER","PRIMARY","KEY"])
in call from sqlite3:build_table_info/2 (src/sqlite3.erl, line 1166)
in call from sqlite3:handle_call/3 (src/sqlite3.erl, line 833)
in call from gen_server:try_handle_call/4 (gen_server.erl, line 607)
in call from gen_server:handle_msg/5 (gen_server.erl, line 639)
in call from proc_lib:init_p_do_apply/3 (proc_lib.erl, line 237)
Any ideas?
I've fixed the problem with INTEGER PRIMARY KEY. The default is harder to support, but I've added a fallback so it doesn't crash, at least. As @CL mentions, this parsing is unreliable anyway (since SQLite unfortunately doesn't expose any way to use its own parser).
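For older versions without that fix, one possible workaround is to ask SQLite itself for the column info through a PRAGMA statement, reusing the sql_exec/2 call from the question. This is only a sketch, but it sidesteps the Erlang-side schema parsing that crashed above:

%% Let SQLite report the columns directly instead of parsing the schema in Erlang
sqlite3:sql_exec(db3, <<"PRAGMA table_info(test);">>).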

Custom command result

When invoking a custom command, I noticed that only the logs are displayed. For example, if my custom command script contains a return statement, return "great custom command", I can't find that value in the result. This happens both with the Java API client and with shell execution.
What can I do to retrieve that result at the end of an execution?
Thanks.
Command definition in service description file:
customCommands ([
    "getText" : "getText.groovy"
])
getText.groovy file content:
def text = "great custom command"
println "trying to get a text"
return text
Assuming that your service file contains the following:
customCommands ([
    "printA" : {
        println "111111"
        return "222222"
    },
    "printB" : "fileB.groovy"
])
And fileB.groovy contains the following code:
println "AAAAAA"
return "BBBBBB"
Then if you run the following command: invoke yourService printA
You will get this:
Invocation results:
1: OK from instance #1..., Result: 222222
invocation completed successfully.
And if you run the following command: invoke yourService printB
You will get this:
Invocation results:
1: OK from instance #1..., Result: AAAAAA
invocation completed successfully.
So if your custom command's implementation is a Groovy closure, then its result is its return value.
And if your custom command's implementation is an external Groovy file, then its result is its last statement output.
HTH,
Tamir.
