ContentMirror integration with Plone 4

I am trying to get the ContentMirror product to work with a Plone 4 site. I've created a buildout from scratch and included only the ContentMirror packages, as described here:
http://code.google.com/p/contentmirror/wiki/Installation
I've set up the tables in a MySQL database according to the instructions in the documentation.
When I create my Plone site, I get the following error:
2011-04-19 15:00:56,768 INFO sqlalchemy.engine.base.Engine.0x...8d2c SELECT DATABASE()
2011-04-19 15:00:56,768 INFO sqlalchemy.engine.base.Engine.0x...8d2c ()
2011-04-19 15:00:56,772 INFO sqlalchemy.engine.base.Engine.0x...8d2c SHOW VARIABLES LIKE 'character_set%%'
2011-04-19 15:00:56,773 INFO sqlalchemy.engine.base.Engine.0x...8d2c ()
2011-04-19 15:00:56,774 INFO sqlalchemy.engine.base.Engine.0x...8d2c SHOW VARIABLES LIKE 'lower_case_table_names'
2011-04-19 15:00:56,775 INFO sqlalchemy.engine.base.Engine.0x...8d2c ()
2011-04-19 15:00:56,776 INFO sqlalchemy.engine.base.Engine.0x...8d2c SHOW COLLATION
2011-04-19 15:00:56,777 INFO sqlalchemy.engine.base.Engine.0x...8d2c ()
2011-04-19 15:00:56,781 INFO sqlalchemy.engine.base.Engine.0x...8d2c SHOW VARIABLES LIKE 'sql_mode'
2011-04-19 15:00:56,781 INFO sqlalchemy.engine.base.Engine.0x...8d2c ()
2011-04-19 15:00:56,784 INFO sqlalchemy.engine.base.Engine.0x...8d2c BEGIN (implicit)
2011-04-19 15:00:56,790 INFO sqlalchemy.engine.base.Engine.0x...8d2c INSERT INTO content (id, content_uid, object_type, status, portal_type, folder_position, container_id, path, relative_path, title, description, subject, location, contributors, creators, creation_date, modification_date, effectivedate, expirationdate, language, rights, excludefromnav) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
2011-04-19 15:00:56,791 INFO sqlalchemy.engine.base.Engine.0x...8d2c ('Members', 'f4363fe1b35567b62fc3283940928e66', 'ATFolderPeer', 'published', 'Folder', 62, None, '/Plone/Members', 'Members', 'Users', 'Site Users', '', '', '', 'admin', datetime.datetime(2011, 4, 19, 15, 0, 56, 349822), datetime.datetime(2011, 4, 19, 15, 0, 56, 391794), None, None, 'en', '', 0)
/plone/dev_PloneCache/eggs/SQLAlchemy-0.6.7-py2.6.egg/sqlalchemy/engine/default.py:299: Warning: Field 'content_id' doesn't have a default value
cursor.execute(statement, parameters)
2011-04-19 15:00:56,795 INFO sqlalchemy.engine.base.Engine.0x...8d2c INSERT INTO content (id, content_uid, object_type, status, portal_type, folder_position, container_id, path, relative_path, title, description, subject, location, contributors, creators, creation_date, modification_date, effectivedate, expirationdate, language, rights, excludefromnav) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)
2011-04-19 15:00:56,796 INFO sqlalchemy.engine.base.Engine.0x...8d2c ('events', '5d2236341ffc7568c0550a3ebf3cefb8', 'ATFolderPeer', 'published', 'Folder', 61, None, '/Plone/events', 'events', 'Events', 'Site Events', '', '', '', 'admin', datetime.datetime(2011, 4, 19, 15, 0, 56, 118873), datetime.datetime(2011, 4, 19, 15, 0, 56, 119133), None, None, 'en', '', 0)
2011-04-19 15:00:56,797 INFO sqlalchemy.engine.base.Engine.0x...8d2c ROLLBACK
2011-04-19 15:00:56 ERROR Zope.SiteErrorLog 1303239656.80.428376990883 http://aktplone02:51002/##plone-addsite
Traceback (innermost last):
Module ZPublisher.Publish, line 135, in publish
Module Zope2.App.startup, line 291, in commit
Module transaction._manager, line 93, in commit
Module transaction._transaction, line 316, in commit
Module transaction._transaction, line 366, in _callBeforeCommitHooks
Module ore.contentmirror.operation, line 212, in flush
Module sqlalchemy.orm.session, line 1392, in flush
Module sqlalchemy.orm.session, line 1473, in _flush
Module sqlalchemy.orm.unitofwork, line 302, in execute
Module sqlalchemy.orm.unitofwork, line 446, in execute
Module sqlalchemy.orm.mapper, line 1884, in _save_obj
Module sqlalchemy.engine.base, line 1191, in execute
Module sqlalchemy.engine.base, line 1271, in _execute_clauseelement
Module sqlalchemy.engine.base, line 1302, in __execute_context
Module sqlalchemy.engine.base, line 1401, in _cursor_execute
Module sqlalchemy.engine.base, line 1394, in _cursor_execute
Module sqlalchemy.engine.default, line 299, in do_execute
Module MySQLdb.cursors, line 174, in execute
Module MySQLdb.connections, line 36, in defaulterrorhandler
IntegrityError: (IntegrityError) (1062, "Duplicate entry '0' for key 1") 'INSERT INTO content (id, content_uid, object_type, status, portal_type, folder_position, container_id, path, relative_path, title, description, subject, location, contributors, creators, creation_date, modification_date, effectivedate, expirationdate, language, rights, excludefromnav) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)' ('events', '5d2236341ffc7568c0550a3ebf3cefb8', 'ATFolderPeer', 'published', 'Folder', 61, None, '/Plone/events', 'events', 'Events', 'Site Events', '', '', '', 'admin', datetime.datetime(2011, 4, 19, 15, 0, 56, 118873), datetime.datetime(2011, 4, 19, 15, 0, 56, 119133), None, None, 'en', '', 0)
2011-04-19 15:00:56 ERROR ZServerPublisher exception caught
Traceback (most recent call last):
File "/plone/dev_PloneCache/eggs/Zope2-2.12.13-py2.6-linux-i686.egg/ZServer/PubCore/ZServerPublisher.py", line 31, in __init__
response=b)
File "/plone/dev_PloneCache/eggs/Zope2-2.12.13-py2.6-linux-i686.egg/ZPublisher/Publish.py", line 438, in publish_module
environ, debug, request, response)
File "/plone/dev_PloneCache/eggs/Zope2-2.12.13-py2.6-linux-i686.egg/ZPublisher/Publish.py", line 264, in publish_module_standard
if request is not None: request.close()
File "/plone/dev_PloneCache/eggs/Zope2-2.12.13-py2.6-linux-i686.egg/ZPublisher/BaseRequest.py", line 215, in close
notify(EndRequestEvent(None, self))
File "/plone/dev_PloneCache/eggs/zope.event-3.4.1-py2.6.egg/zope/event/__init__.py", line 23, in notify
subscriber(event)
File "/plone/dev_PloneCache/eggs/zope.component-3.7.1-py2.6.egg/zope/component/event.py", line 26, in dispatch
for ignored in zope.component.subscribers(event, None):
File "/plone/dev_PloneCache/eggs/zope.component-3.7.1-py2.6.egg/zope/component/_api.py", line 138, in subscribers
return sitemanager.subscribers(objects, interface)
File "/plone/dev_PloneCache/eggs/zope.component-3.7.1-py2.6.egg/zope/component/registry.py", line 323, in subscribers
return self.adapters.subscribers(objects, provided)
AttributeError: adapters
So the site fails to be created. I've tried dropping the table, but then another error pops up indicating that a table cannot be found, so I think I'm on the right track.
I'm trying to determine whether this is a viable way to store content in a SQL database, as some pages have suggested. I am definitely missing something here. Does anyone have experience with this product who could help me out?
Thanks.
Young Kim

Disclaimer: I have no experience with ContentMirror, so I might be totally off target, but from what you posted it seems that the 'id' value is not being auto-incremented for the 'content' table.
In SQLAlchemy, this is caused by an erroneous setting that overrides the default behaviour, something like
Table('content', metadata,
    Column('id', Integer, primary_key=True, autoincrement=False),
    ...
)
In the SQL file ('mirror.sql') you generated using the 'ddl.py' script, check that the corresponding schema definition for the 'content' table has the AUTO_INCREMENT attribute on the 'id' column, as in the following example:
CREATE TABLE `content` (
    `id` int(11) NOT NULL AUTO_INCREMENT PRIMARY KEY,
    ...
)
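To see why a missing auto-increment produces exactly this kind of duplicate-key failure, here is a minimal sketch using Python's built-in sqlite3 module (an illustration only; the original setup uses MySQL, and the column names are stand-ins based on the warning about 'content_id'):

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Without auto-increment semantics, a key column with a constant default
# collides on the second insert -- the same failure mode as the
# "Duplicate entry '0' for key 1" error above.
conn.execute("CREATE TABLE bad (content_id INT PRIMARY KEY DEFAULT 0, id TEXT)")
conn.execute("INSERT INTO bad (id) VALUES ('Members')")
try:
    conn.execute("INSERT INTO bad (id) VALUES ('events')")
except sqlite3.IntegrityError as exc:
    print("second insert failed:", exc)

# With an auto-assigned integer key, both inserts succeed.
conn.execute("CREATE TABLE good (content_id INTEGER PRIMARY KEY, id TEXT)")
conn.execute("INSERT INTO good (id) VALUES ('Members')")
conn.execute("INSERT INTO good (id) VALUES ('events')")
print(conn.execute("SELECT content_id, id FROM good ORDER BY content_id").fetchall())
```

The same principle applies in MySQL: without AUTO_INCREMENT (or an explicit value), every row falls back to the column default, and the second row violates the primary key.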

Related

Airflow SnowflakeOperator: Assign values from previous statement in SQL query

Requirement: Assign values from a previous statement to the next statement in a SQL query, run via SnowflakeOperator in Airflow.
SQL:
BEGIN
    app = 'abc';
    env = select current_database();
    start_time = select current_timestamp()::timestamp_ntz(9);
    end_time = select current_timestamp()::timestamp_ntz(9);
    duration = (end_time.getTime() - start_time.getTime()) / 1000;
    insert into proc_runtimes
        (env, app, task, start_time, end_time, duration, message)
    values
        (env, app, 'Job Start', start_time.toISOString(), end_time.toISOString(), duration, log_message);
END
EDIT:
Requirement: Assign values from the previous statement to the next statement in SQL query, as I run the query in SnowflakeOperator in Airflow
Error: Airflow SnowflakeOperator not able to execute the anonymous block statement in the SQL file
SQL:
BEGIN
    let app := 'abc';
    let env := current_database();
    let start_time := current_timestamp()::timestamp_ntz(9);
    let end_time := current_timestamp()::timestamp_ntz(9);
    let duration := DATEDIFF(seconds, end_time, start_time);
    let log_message := 'some log';
    INSERT INTO proc_runtimes
        (env, app, task_name, start_time, end_time, duration, message)
    SELECT
        :env, :app, 'Job Start', :start_time, :end_time, :duration, :log_message;
END;
Error:
[2022-08-16, 19:38:43 UTC] {cursor.py:696} INFO - query: [BEGIN let env := current_database();]
[2022-08-16, 19:38:43 UTC] {cursor.py:720} INFO - query execution done
[2022-08-16, 19:38:43 UTC] {connection.py:509} INFO - closed
[2022-08-16, 19:38:44 UTC] {connection.py:512} INFO - No async queries seem to be running, deleting session
[2022-08-16, 19:38:44 UTC] {taskinstance.py:1889} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/snowflake/operators/snowflake.py", line 120, in execute
execution_info = hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/snowflake/hooks/snowflake.py", line 301, in run
cur.execute(sql_statement)
File "/home/airflow/.local/lib/python3.7/site-packages/snowflake/connector/cursor.py", line 782, in execute
self.connection, self, ProgrammingError, errvalue
File "/home/airflow/.local/lib/python3.7/site-packages/snowflake/connector/errors.py", line 273, in errorhandler_wrapper
error_value,
File "/home/airflow/.local/lib/python3.7/site-packages/snowflake/connector/errors.py", line 324, in hand_to_other_handler
cursor.errorhandler(connection, cursor, error_class, error_value)
File "/home/airflow/.local/lib/python3.7/site-packages/snowflake/connector/errors.py", line 210, in default_errorhandler
cursor=cursor,
snowflake.connector.errors.ProgrammingError: 001003 (42000): 01a6551a-0501-b736-0251-83014fb1394b: SQL compilation error:
syntax error line 3 at position 34 unexpected '<EOF>'.
Variables should be declared with `let` (`:=` is the assignment operator) and can be accessed later with a `:` prefix:
Test table:
create or replace table proc_runtimes(
    env TEXT,
    app TEXT,
    task_name TEXT,
    start_time timestamp_ntz(9),
    end_time timestamp_ntz(9),
    duration TEXT,
    message TEXT);
Main block:
BEGIN
    let app := 'abc';
    let env := current_database();
    let start_time := current_timestamp()::timestamp_ntz(9);
    let end_time := current_timestamp()::timestamp_ntz(9);
    let duration := DATEDIFF(seconds, end_time, start_time);
    let log_message := 'some log';
    INSERT INTO proc_runtimes
        (env, app, task_name, start_time, end_time, duration, message)
    SELECT
        :env, :app, 'Job Start', :start_time, :end_time, :duration, :log_message;
END;
Check:
SELECT * FROM proc_runtimes;
Resolved the issue by wrapping the block in an EXECUTE IMMEDIATE statement:
execute immediate $$
BEGIN
....
....
....
END;
$$
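A small helper can apply that wrapping before the SQL is handed to the operator. This is a hypothetical sketch (the function name and usage are assumptions, not part of the Snowflake provider):

```python
def wrap_in_execute_immediate(block: str) -> str:
    """Wrap a Snowflake Scripting block in EXECUTE IMMEDIATE $$...$$ so the
    connector submits it as a single statement instead of splitting the
    BEGIN...END body into fragments (the failure shown in the log above)."""
    return "execute immediate $$\n{}\n$$".format(block.strip())

sql = wrap_in_execute_immediate("""
BEGIN
    let app := 'abc';
END;
""")
print(sql.splitlines()[0])  # execute immediate $$
```

The wrapped string can then be passed as the `sql` argument of the SnowflakeOperator, or stored in the .sql file with the `execute immediate $$ ... $$` delimiters already in place, as the answer above does.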

Airflow SnowflakeOperator: Last query id returned to XCom

I have a Snowflake SQL file with a query like the one below, and in the SnowflakeOperator I want a return value so that I can pass an XCom to the next task.
How can I get only the last query ID returned for XCom? Basically, I need to get Snowflake's last query ID into XCom.
In SQL File:
select columns from tableA ;
select last_query_id();
Error : Multiple SQL statements in a single API call are not supported; use one API call per statement instead.
Or is there a way I can get the query ID below returned to XCom?
Code:
class LastQueryId(SnowflakeOperator):
    def execute(self, context: Any) -> None:
        """Run query on snowflake"""
        self.log.info('Executing: %s', self.sql)
        hook = SnowflakeHook(snowflake_conn_id=self.snowflake_conn_id,
                             warehouse=self.warehouse, database=self.database,
                             role=self.role, schema=self.schema, authenticator=self.authenticator)
        result = hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
        self.query_ids = hook.query_ids
        if self.do_xcom_push and len(self.query_ids) > 0:
            return self.query_ids[-1]
UPDATED: I was able to get the Snowflake query ID with the code above, but the log also shows the result of the query. How can I avoid those entries in the log?
[2022-06-23, 20:43:39 UTC] {cursor.py:696} INFO - query: [SELECT modifieddate, documentdate...]
[2022-06-23, 20:43:40 UTC] {cursor.py:720} INFO - query execution done
[2022-06-23, 20:43:56 UTC] {snowflake.py:307} INFO - Statement execution info - {'MODIFIEDDATE': datetime.datetime(2022, 6, 23, 11, 42, 34, 233000), 'DOCUMENTDATE': datetime.datetime(2015, 10, 1, 0, 0)...}
[2022-06-23, 20:43:56 UTC] {snowflake.py:307} INFO - Statement execution info - {'MODIFIEDDATE': datetime.datetime(2022, 6, 23, 13, 50, 45, 377000), 'DOCUMENTDATE': datetime.datetime(2021, 7, 1, 0, 0)...}
[2022-06-23, 20:43:56 UTC] {snowflake.py:307} INFO - Statement execution info - {'MODIFIEDDATE': datetime.datetime(2022, 6, 23, 11, 36, 51, 583000), 'DOCUMENTDATE': datetime.datetime(2015, 8, 31, 0, 0)...}
....
....
....
[2022-06-23, 20:43:56 UTC] {snowflake.py:311} INFO - Rows affected: 22116
[2022-06-23, 20:43:56 UTC] {snowflake.py:312} INFO - Snowflake query id: 01a5259b-0501-98f3-0251-830144baa623
[2022-06-23, 20:43:56 UTC] {connection.py:509} INFO - closed
SnowflakeOperator already stores the query_ids, but it does not push them to XCom.
You can create a custom operator:
class MySnowflakeOperator(SnowflakeOperator):
    def execute(self, context: Any) -> None:
        """Run query on snowflake"""
        self.log.info('Executing: %s', self.sql)
        hook = self.get_db_hook()
        execution_info = hook.run(self.sql, autocommit=self.autocommit, parameters=self.parameters)
        self.query_ids = hook.query_ids
        if self.do_xcom_push and len(self.query_ids) > 0:
            return self.query_ids[-1]  # last query_id
If you want to maintain the original operator's functionality, you can do:
class MySnowflakeOperator(SnowflakeOperator):
    def execute(self, context: Any) -> None:
        parent_return_value = super().execute(context)
        if self.do_xcom_push and len(self.query_ids) > 0:
            self.xcom_push(
                context,
                key="last_query_id",
                value=self.query_ids[-1],
            )
        return parent_return_value

Airflow PythonOperator template_dict raises error TemplateNotFound(template)

I'm trying to pass bar.sql through the PythonOperator's templates_dict for use in the python_callable, as the docs mention, but this is the closest example I've found. I've also reviewed this question, which references Airflow 1.8, but the solution did not work for me in practice; I'm using Airflow 2.2.4.
(Also, there seems to be a well-known BashOperator issue (question and docs references) where TemplateNotFound errors are common. For the BashOperator, you can work around it by changing bash_command='script.sh' to bash_command='script.sh ' with a trailing space, but I did not have any such luck applying this to my .sql file passed to PythonOperator's templates_dict.)
My task below results in the logs raising the error TemplateNotFound: bar.sql
with DAG(
    'bigquery-dag',
    default_args=default_args,
    schedule_interval=timedelta(days=1),
    start_date=datetime(2021, 1, 1),
    catchup=False,
    template_searchpath=['usr/local/airflow/include']
) as dag:

    start = DummyOperator(
        task_id='start',
        on_success_callback=some_other_function
    )

    t1 = PythonOperator(
        task_id='sql_printer',
        python_callable=sqlPrinter,
        templates_dict={'sql': 'bar.sql'},
        templates_exts=['.sql'],
        provide_context=True
    )

    start >> t1
My goal is for bar.sql to be available for use in sqlPrinter
-- ~/include/bar.sql
select 'hello world'
def sqlPrinter(**context):
    print(f"sql: {context['templates_dict']['sql']}")
The result I would like to see is
>>> sql: select 'hello world'
Below is the error log from the sql_printer task.
[2022-04-04, 22:32:29 ] {taskinstance.py:1264} INFO - Executing <Task(PythonOperator): sql_printer> on 2022-04-05 03:32:27.076984+00:00
[2022-04-04, 22:32:29 ] {standard_task_runner.py:52} INFO - Started process 15289 to run task
[2022-04-04, 22:32:29 ] {standard_task_runner.py:76} INFO - Running: ['airflow', 'tasks', 'run', 'bigquery-dag', 'sql_printer', 'manual__2022-04-05T03:32:27.076984+00:00', '--job-id', '456', '--raw', '--subdir', 'DAGS_FOLDER/bigquery-dag.py', '--cfg-path', '/tmp/tmp0fjl_t2a', '--error-file', '/tmp/tmpuy00moli']
[2022-04-04, 22:32:29 ] {standard_task_runner.py:77} INFO - Job 456: Subtask sql_printer
[2022-04-04, 22:32:29 ] {logging_mixin.py:109} INFO - Running <TaskInstance: bigquery-dag.sql_printer manual__2022-04-05T03:32:27.076984+00:00 [running]> on host 1296ec2abf88
[2022-04-04, 22:32:30 ] {taskinstance.py:1718} ERROR - Task failed with exception
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1334, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1423, in _execute_task_with_callbacks
self.render_templates(context=context)
File "/usr/local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 2011, in render_templates
self.task.render_template_fields(context)
File "/usr/local/lib/python3.9/site-packages/airflow/models/baseoperator.py", line 1061, in render_template_fields
self._do_render_template_fields(self, self.template_fields, context, jinja_env, set())
File "/usr/local/lib/python3.9/site-packages/airflow/models/baseoperator.py", line 1074, in _do_render_template_fields
rendered_content = self.render_template(content, context, jinja_env, seen_oids)
File "/usr/local/lib/python3.9/site-packages/airflow/models/baseoperator.py", line 1131, in render_template
return {key: self.render_template(value, context, jinja_env) for key, value in content.items()}
File "/usr/local/lib/python3.9/site-packages/airflow/models/baseoperator.py", line 1131, in <dictcomp>
return {key: self.render_template(value, context, jinja_env) for key, value in content.items()}
File "/usr/local/lib/python3.9/site-packages/airflow/models/baseoperator.py", line 1108, in render_template
template = jinja_env.get_template(content)
File "/usr/local/lib/python3.9/site-packages/jinja2/environment.py", line 997, in get_template
return self._load_template(name, globals)
File "/usr/local/lib/python3.9/site-packages/jinja2/environment.py", line 958, in _load_template
template = self.loader.load(self, name, self.make_globals(globals))
File "/usr/local/lib/python3.9/site-packages/jinja2/loaders.py", line 125, in load
source, filename, uptodate = self.get_source(environment, name)
File "/usr/local/lib/python3.9/site-packages/jinja2/loaders.py", line 214, in get_source
raise TemplateNotFound(template)
jinja2.exceptions.TemplateNotFound: bar.sql
Try using templates_dict={'query': 'bar.sql'}:
t1 = PythonOperator(
    task_id='sql_printer',
    python_callable=sqlPrinter,
    templates_dict={'query': 'bar.sql'},
    provide_context=True
)
def sqlPrinter(**context):
    print(f"sql: {context['templates_dict']['query']}")
The idea came from this post.
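For what it's worth, TemplateNotFound generally means none of the template_searchpath entries actually contain the file. One thing worth checking (an assumption here, not something the poster confirmed) is that 'usr/local/airflow/include' has no leading slash, so it is resolved relative to the process's working directory. A minimal sketch of how a filesystem template loader searches, with a hypothetical helper name:

```python
import os
import tempfile

def resolve_template(searchpaths, name):
    """Mimic a filesystem template loader: try each searchpath in order and
    return the first existing file; otherwise fail (Jinja2 raises
    TemplateNotFound at the equivalent point)."""
    for base in searchpaths:
        candidate = os.path.join(base, name)
        if os.path.isfile(candidate):
            return candidate
    raise FileNotFoundError(name)

with tempfile.TemporaryDirectory() as root:
    include = os.path.join(root, "include")
    os.makedirs(include)
    with open(os.path.join(include, "bar.sql"), "w") as f:
        f.write("select 'hello world'")

    # Absolute searchpath: the template is found.
    print(resolve_template([include], "bar.sql"))

    # Relative searchpath (like 'usr/local/airflow/include'): not found
    # unless the working directory happens to line up.
    try:
        resolve_template(["usr/local/airflow/include"], "bar.sql")
    except FileNotFoundError:
        print("TemplateNotFound: bar.sql")
```

If the path is the culprit, switching the DAG to an absolute template_searchpath (e.g. starting with '/') should make the templates_dict rendering succeed regardless of the key name used.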

Why is my WordPress post, created programmatically, not opening its detail page and redirecting to the index instead?

I created a post programmatically using python:
sql = "insert into `wp_18_posts` " \
"(`post_author`, `post_date`, `post_date_gmt`, `post_content`, `post_title`, " \
"`post_excerpt`, `post_status`, `comment_status`, `ping_status`, `post_password`, `post_name`," \
"`to_ping`, `pinged`, `post_modified`, `post_modified_gmt`, `post_content_filtered`, `post_parent`," \
"`guid`, `menu_order`, `post_type`, `post_mime_type`, `comment_count`, `django_id`)" \
" values " \
"(%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)"
The post shows up in the database and on the index page, but once I click on it, I am redirected back to the index, so the detail page does not open.
I am setting post_status to 'publish'.
I also set the taxonomy in MySQL manually, but no success.
EDIT:
This is the debug.log:
I'd appreciate any hint.
As per the debug report you added, it says a table wp_9_option doesn't exist, while you are using the prefix 'wp_18'.
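One way to avoid that kind of prefix mismatch is to build the table name from a single prefix setting instead of hard-coding it in each query. A hypothetical sketch (the variable names are illustrative; in a multisite install the prefix is $table_prefix from wp-config.php plus the site id):

```python
# Keep the prefix in one place so the insert targets the same tables
# WordPress itself reads; 'wp_18_' implies site id 18 in a multisite setup.
table_prefix = "wp_18_"

sql = (
    "insert into `{prefix}posts` (`post_author`, `post_title`, `post_status`) "
    "values (%s, %s, %s)"
).format(prefix=table_prefix)

print(sql)
```

That said, inserting rows directly bypasses WordPress's own bookkeeping (permalinks, taxonomy relationships), which is consistent with a post appearing in the index but failing to resolve its detail page.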

boto - delete fails because of schema mismatch

I have a table called Events, with deviceID as a primary key, and timeStamp as a sort key. Now I'm trying to delete an item given both of these keys:
dynamodb = boto3.resource('dynamodb')
events_table = dynamodb.Table('Events')
events_table.delete_item(
    Key={
        'deviceID': 'xyz123',
        'timeStamp': 12314156.54345
    }
)
Why am I getting a schema mismatch error? Output below:
File "C:\Python27\lib\site-packages\boto3\resources\factory.py", line 498, in do_action
response = action(self, *args, **kwargs)
File "C:\Python27\lib\site-packages\boto3\resources\action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "C:\Python27\lib\site-packages\botocore\client.py", line 236, in _api_call
return self._make_api_call(operation_name, kwargs)
File "C:\Python27\lib\site-packages\botocore\client.py", line 500, in _make_api_call
raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the DeleteItem operation:
The provided key element does not match the schema
According to the documentation, with the low-level client each key attribute needs a type descriptor:
client = boto3.client('dynamodb')
client.delete_item(
    TableName='tbl_name',
    Key={
        'deviceID': {'S': 'xyz123'},
        'timeStamp': {'N': '12314156.54345'}
    }
)
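If you prefer to stay with the resource API from the question, note that boto3's resource layer does not accept Python floats for DynamoDB numbers; the numeric sort-key value should be a Decimal. A minimal sketch (the conversion helper is an assumption for illustration, not part of boto3):

```python
from decimal import Decimal

def to_dynamodb_number(value):
    """Convert via str() so the Decimal carries the literal digits rather
    than binary-float artifacts; DynamoDB numbers are decimal-based."""
    return Decimal(str(value))

# Key dict as the resource API's delete_item would expect it.
key = {'deviceID': 'xyz123', 'timeStamp': to_dynamodb_number(12314156.54345)}
print(key['timeStamp'])
```

With the key values typed this way, events_table.delete_item(Key=key) matches a table whose sort key is a Number, assuming the attribute names match the table schema exactly (they are case-sensitive).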
