I am using the native WordPress function wp_nav_menu() to create my site's navigation menus. This function takes a long time to run, especially when the navigation menu is large like mine. So my thought to get around this is as follows:
session_start();
if (isset($_SESSION['topTranslucent'])) {
    echo $_SESSION['topTranslucent'];
} else {
    // Output buffering is necessary because wp_nav_menu() echoes its result.
    ob_start();
    wp_nav_menu(array('menu' => 'Top Translucent', 'container' => '', 'menu_id' => 'topMenu'));
    $_SESSION['topTranslucent'] = ob_get_contents();
    ob_end_flush();
}
My thinking here is that it will be much faster to print the HTML stored in the session variable than to rerun the function on every page load. But not being too experienced with PHP sessions, I wanted to get some expert opinions from you lovely wunderkinds at Stack Overflow. The question is: are sessions actually just doing what they seem to be doing (i.e. storing text data in a cookie to be used across pages), or is there more than meets the eye?
Sessions store the serialized data on the server; they use cookies for identification only. Example:
Client:
cookie { PHPSESSID => '1234567890a' }
Server:
cookie { PHPSESSID => '1234567890a' }
=> session 1234567890a {
topTranslucent => '<yourcode>whatever</yourcode>'
}
Your approach could work. Note that the whole session is unserialized on load, so overusing this will slow the system down because it will load a lot of data; using it for a few small snippets should be OK.
Possibly a better approach would be using a mechanism dedicated to caching, but sessions-as-a-cache are somewhat usable.
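Since this is WordPress, the Transients API would be my first choice over sessions: it caches the markup once for the whole site, with an expiry. A minimal sketch (the cache key and the one-hour lifetime are arbitrary choices of mine):
// Cache the rendered menu once for all visitors via the Transients API.
$menu = get_transient('top_translucent_menu');
if ($menu === false) {
    ob_start();
    wp_nav_menu(array('menu' => 'Top Translucent', 'container' => '', 'menu_id' => 'topMenu'));
    $menu = ob_get_clean();
    set_transient('top_translucent_menu', $menu, 3600); // expires after one hour
}
echo $menu;
Unlike the session approach, this renders the menu once per hour for the whole site instead of once per visitor.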
Related
I had to add a WordPress installation to my CodeIgniter system, so I've put it in a subfolder called blog and excluded that folder in my .htaccess. All good and well.
I've put all the WordPress tables into my CodeIgniter database with the prefix _wp.
I've now loaded the WordPress blog header file into the index.php of CodeIgniter, like so:
require('blog/wp-blog-header.php');
add_filter('site_url', 'ci_site_url', 1);
function ci_site_url() {
include(FCPATH.'/application/config/config.php');
return $config['base_url'];
}
And I made a registration method in my Account controller to create an actual link to my Customers. I do this because I want to make the WordPress login/registration obsolete and control that solely from the CodeIgniter login page:
protected function register_wp($email_address = FALSE) {
    if ($email_address !== FALSE) {
        if (username_exists($email_address) == NULL) {
            $password = wp_generate_password(12, TRUE);
            $user_id = wp_create_user($email_address, $password, $email_address);
            wp_update_user(array(
                'ID' => $user_id,
                'nickname' => $email_address
            ));
            $user = new WP_User($user_id);
            $user->set_role('subscriber');
            $login_data = array(
                'user_id' => $user_id,
                'password' => $password,
            );
            return $login_data;
        }
        else {
            // User already exists with that email address
            return FALSE;
        }
    }
    else {
        // No email_address given
        return FALSE;
    }
}
And the login method, to give an idea:
protected function login_wp($user_id = FALSE) {
    if ($user_id !== FALSE) {
        // Look up the user record for the given ID.
        $user = get_userdata($user_id);
        $user_login = $user->user_login;
        wp_set_current_user($user_id, $user_login);
        wp_set_auth_cookie($user_id);
        do_action('wp_login', $user_login);
    }
    else {
        // No user_id given
        return FALSE;
    }
}
All still going well. But here comes the clash, something I was very sad about because everything worked so well up until now:
WordPress takes over the session and kills CodeIgniter's session.
I already tried tons of things:
session_name('PHPSESSIDWP'); and then starting another session (with another name) for CodeIgniter after WordPress was loaded
Setting the cookie path (I'm not 100% sure I did this right, as it didn't change anything; I've also read online that it doesn't work well in all browsers)
Setting the cookie domain (seemed to have no effect)
The problem is I can't load require('blog/wp-blog-header.php'); only in the controller method, as I need to be able to control the logged-in state of the WordPress part. Besides that, I would get complaints about the site_url() function, which is already claimed by CodeIgniter's URL helper.
I think the problem is mainly that CodeIgniter and WordPress each use their own unique way of handling sessions (CI in the database and WordPress in "super globals"), which probably means they only use the cookie to remember a "state".
My whole CodeIgniter system already runs on the database-driven session models, so switching is an absolute no-go. And it seems WordPress core can't even work with sessions anymore (I know sessions "do" work, but that doesn't seem to count in any way for the WP core system).
I also commented out wp_unregister_GLOBALS(); in the wp-settings.php file.
And I also tried renaming my session cookie name in CodeIgniter to something like session_ci.
I really hope someone knows a way to tell CodeIgniter or WordPress to only update their own values and not kill the whole session each time. I also read something about splitting up cookies with .htaccess but can't find good resources on it, so if anyone knows how to do that, I would be eternally grateful.
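To illustrate the renaming attempt, this is roughly what I tried (a sketch only; the cookie name and path are arbitrary, and it assumes the conflicting code relies on PHP's native session):
// Give the WordPress side its own session cookie, scoped to the /blog
// subfolder, before loading wp-blog-header.php, so it can't collide
// with CodeIgniter's own session cookie.
session_name('PHPSESSIDWP');            // distinct cookie name for the WP side
session_set_cookie_params(0, '/blog');  // limit the cookie to the blog subfolder
session_start();
require('blog/wp-blog-header.php');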
I'm in despair. Being 98% finished and then getting such a letdown at the end :(..
Update
Maybe I can do something in the WordPress code that handles the cookies?
http://codex.wordpress.org/Function_Reference/wp_set_auth_cookie
Sadly, I'm not really at home in the WordPress world. I only have to use it this one time because of the bought template that the people really wanted to use for the blog.
Also, this page states the following:
WordPress uses the two cookies to bypass the password entry portion of wp-login.php. If WordPress recognizes that you have valid, non-expired cookies, you go directly to the WordPress Administration interface. If you don't have the cookies, or they're expired, or in some other way invalid (like you edited them manually for some reason), WordPress will require you to log in again, in order to obtain new cookies.
I wonder, though, how to bypass that "invalid" check, which is probably the reason it kills the CodeIgniter cookie(s)? Weirdly enough, it seems the session_ci value stays, although the session still seems killed.
You need to put your session start at the very top of config.php.
This is the only place a session will not be destroyed by WordPress.
if (!session_id()) {
    session_start();
}
If your PHP installation does not have register_globals enabled, the above code should allow you to use sessions. However, if it does, you will not be able to get the data that was set in a previous request, because WordPress destroys all data contained in the session variable when it does its initialization.
Here's the explanation and troubleshooting for this -> kanasolution.com
EXPANDED ANSWERS:
Source: http://codex.wordpress.org/WordPress_Cookies
On login, WordPress uses the wordpress_[hash] cookie to store your authentication details. Its use is limited to the admin console area, /wp-admin/.
After login, WordPress sets the wordpress_logged_in_[hash] cookie, which indicates when you're logged in and who you are, for most interface use.
So WordPress clearly dislikes the way you're writing cookies, perhaps because they lack its 8-pass MD5 hash, etc.? See WordPress encryption methods.
The WordPress Environment
The next thing I would try is integrating your custom login page into the WordPress environment instead of just requiring the header (let's stay away from editing core).
WordPress & AJAX by Ronald Huereca explains manually loading the WordPress environment on page 78.
The use of the dirname() calls depends on where your file sits in the hierarchy; adjust them as needed. The code should run before any HTML output in your file.
$root = dirname(dirname(dirname(dirname(dirname(__FILE__)))));
if (file_exists($root . '/wp-load.php')) {
    require_once($root . '/wp-load.php');
    /* Run custom WordPress stuff here */
    // Output header HTML, queue scripts and styles, and include BODY content.
    wp_enqueue_script('my_script', get_stylesheet_directory_uri() . '/my_script.js', array('jquery'), '1.0.0');
    wp_print_scripts(array('my_script'));
}
We are working on proxy-based protection software. It catches the user's HTTP request, does the proxy work, catches the HTTP response, modifies its content, and sends it back to the original user.
We have tried two approaches:
A Squid proxy with a PHP wrapper around Squid. It was promising, but at the PHP stream level we did not know the length of the response data we were expecting, so it timed out every time => SLOW.
Now we have written a .NET application. It does everything we need, and it's pretty fast as long as it does not modify the content. If we need to gzip/gunzip or modify the content, it becomes very slow.
Could you help us?
We have been working on this project for almost a year at our university in Hungary. We wrote an automatic, self-learning, fully semantic analyzer engine, which can analyze and interpret any language and can detect and screen the target content. We also built image recognition software, which can detect the target object with 90% confidence in any image.
So everything is ready, but our proxy application is stuck.
We could also pay for this job if anybody would write it.
I spend a lot of time programming in PHP. Yes, as an interpreted language it can be slow, and there is a huge amount of badly written code available, but even before you start to touch the code, tuning the environment can reduce execution time by a factor of 5-10. Changing the code can make it faster still; the biggest wins come from good choices of architecture and data structures (which is true of any language, not just PHP).
I don't know where you're starting from, but I find it surprising that you are not able to process the stream in time relative to how long it takes to generate the content and send it across the network. For it to be timing out, something is very wrong (you're not trying to parse the HTML using one of the XML parsers, are you?). The length of the content should have little impact on the performance of the script unless you are trying to map it all into PHP's address space at once.
However, AFAIK, it's not possible to implement a content filter directly in Squid using PHP (if you did, I'd love to know how; also, if you've implemented ICAP, that's very interesting). I'm guessing you are using a URL redirector to route the requests via a proxy script written in PHP.
It is possible to write an eCAP module in C/C++.
Image recognition and natural language processing are not trivial programming exercises, so you must have some good programmers on your team. Really addressing your problem goes rather beyond the scope of a Stack Overflow answer, and touting for contractors is definitely off topic.
Thanks for your reply!
First of all: our PHP is pretty fast; it's the fsockopen part that is slow, because it cannot know when to close the response connection from Squid.
Here is our code:
$buffer = socket_read($client, 4096);
if (!($handle = fsockopen(HOST, SQUIDPROXYPORT, $errno, $error, 1))) {
    Log::write($this->log, 'Errno: ' . $errno . ' Error: ' . $error . "\n" . $buffer);
    exit('Failed to connect! ' . $errno . ':' . $error);
}
stream_set_timeout($handle, 0, 100000);
// Forward the raw client request to Squid.
fwrite($handle, $buffer);
$result = '';
do {
    $tmp = fgets($handle, 1024);
    if ($tmp) {
        $result .= $tmp;
    }
} while (!feof($handle) && $tmp != false);
// Send the collected response back to the client.
socket_write($client, $result, strlen($result));
fclose($handle);
socket_close($client);
Again, how it works:
Client sends an HTTP request to us
Our PHP gets the request and sends its header to the Squid proxy
Squid does its work and sends the response data back to our PHP
Our PHP reads the response data from Squid via fsockopen
We analyze the response data, or modify it
We send it back to the client
BUT:
While we are waiting for the response data, we do receive it, but we cannot know when to close the connection between our PHP and Squid. This results in slow operation and a timeout almost every time.
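One idea we are considering (a sketch, untested) is to force the upstream connection to close after each response by rewriting the request headers before we forward them to Squid; then feof() becomes reliable instead of waiting for the timeout:
// Rewrite any Connection header in the client request to "Connection: close"
// so the connection is terminated once the response body is complete.
$buffer = preg_replace('/^Connection:[^\r\n]*/mi', 'Connection: close', $buffer, -1, $count);
if ($count === 0) {
    // No Connection header present: insert one right after the request line.
    $buffer = preg_replace('/\r\n/', "\r\nConnection: close\r\n", $buffer, 1);
}
fwrite($handle, $buffer);
The alternative would be to parse the Content-Length header of the response and read exactly that many bytes instead of waiting for end-of-file.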
If you have any idea, please share it with us!
I am looking for more detailed information on how I can get the following caching behavior in Drupal 7.
I want a block that renders information retrieved from an external service. As the block is rendered for many users, I do not want to continually request data from that service, but instead cache the result. However, this data changes relatively frequently, so I'd like to retrieve the latest data every 5 or 10 minutes and then cache it again.
Does anyone know how to achieve such caching behavior without writing too much code oneself? I also haven't found much good documentation on how to use caching in Drupal 7, so any pointers on that are appreciated as well.
Keep in mind that cache_get() does not actually check whether an item has expired. So you need to use:
if (($cache = cache_get('your_cache_key')) && $cache->expire >= REQUEST_TIME) {
  return $cache->data;
}
Also make sure to use the REQUEST_TIME constant rather than time() in D7.
The functions cache_set() and cache_get() are what you are looking for. cache_set() has an expire argument.
You can use them basically like this:
<?php
if ($cached_data = cache_get('your_cache_key')) {
  // Return from cache.
  return $cached_data->data;
}
// No or outdated cache entry, refresh data.
$data = _your_module_get_data_from_external_service();
// Save data in cache with 5min expiration time.
cache_set('your_cache_key', $data, 'cache', time() + 60 * 5);
return $data;
?>
Note: You can also use a different cache bin (see documentation links) but you need to create a corresponding cache table yourself as part of your schema.
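If you do create your own bin, the table can simply be cloned from core's cache schema in your module's .install file (a minimal sketch, assuming a module named mymodule):
/**
 * Implements hook_schema().
 */
function mymodule_schema() {
  // Reuse the structure of the standard cache table for our own bin.
  $schema['cache_mymodule'] = drupal_get_schema_unprocessed('system', 'cache');
  $schema['cache_mymodule']['description'] = 'Cache table for mymodule data.';
  return $schema;
}
Then pass 'cache_mymodule' as the bin argument to cache_set() and cache_get().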
I think this should be $cache->expire, not expires. I didn't have luck with this example when setting REQUEST_TIME + 300 in cache_set(), since $cache->expires will always be less than REQUEST_TIME. This works for me:
if (($cache = cache_get('your_cache_key', 'cache')) && (REQUEST_TIME < $cache->expire)) {
  return $cache->data;
}
I have a page that displays some data. The source of the data is not Drupal nodes, so Views is of no use to me:
function mymodule_main_page($arg1, $arg2, $arg3) {
  $results = call_remote_api_and_get_lots_of_results($arg1, $arg2, $arg3);
  return theme('mymodule_page', $results, $arg1, $arg2, $arg3);
}
My module also displays a block. The block's purpose is to summarize the results that were returned in the main page content (e.g. number of results: X, number of pages: Y, etc.):
/**
 * Implementation of hook_block().
 */
function mymodule_block($op = 'list', $delta = 0, $edit = array()) {
  switch ($op) {
    case 'view':
      if ($delta == 0) {
        $block['subject'] = t('Results summary');
        $block['content'] = theme('mymodule_results_summary');
      }
      break;
  }
  return $block;
}
I need to avoid generating the results again. What is the best way for my block to access the results object returned in the function that drew the main page? Global or static vars? Is there an existing module that already attempts to solve this problem?
A very good and flexible solution is to use the Drupal core functions cache_set() and cache_get(), as ya.teck mentioned, but extend their functionality with the Cache Router module. You can specify cache storage engines and use Memcache or shared memory for your cache. It doesn't use the database for storing data and is very fast.
In addition to the cache system that ya.teck mentions, a simpler way is to cache the entire block for x minutes, hours, or days. Drupal has a built-in cache system for all blocks. You can see some of the settings at admin/settings/performance.
Update:
The Drupal way, in both core and contrib, is to use a static variable (an array or the actual variable) and store the result of the heavy lifting there. An example is node_load(), which stores all loaded nodes in a static array so each node only needs to be loaded once per request.
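For the question's case, that pattern could look something like this (a sketch; mymodule_get_results() is a hypothetical wrapper around the question's own call_remote_api_and_get_lots_of_results(), and it assumes the arguments don't change within one request):
function mymodule_get_results($arg1, $arg2, $arg3) {
  // Static cache: the first caller in the request pays the API cost,
  // later callers (e.g. the block) get the stored result for free.
  static $results;
  if (!isset($results)) {
    $results = call_remote_api_and_get_lots_of_results($arg1, $arg2, $arg3);
  }
  return $results;
}
Both the page callback and the block's theme function would then call mymodule_get_results() instead of hitting the API directly.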
You may store your data using the Drupal cache system.
See the cache_set() and cache_get() functions for more information.
I'm working on a content-dripper custom plugin in WordPress that my client asked me to build. He wants it to catch a page view event and, if it's the right time of day (24 hours since the last post), pull from a resource file and output another post. He also needed it to raise a flag and prevent other sessions from firing that same snippet of code. So, raise some kind of flag saying, "I'm posting that post, go away other process," then make the post and release the flag again.
However, the strangest thing occurs when the site is placed under load with multiple sessions hitting it with page views. Instead of firing one post, it randomly creates 1, 2, or 3 extra posts, each one thinking it was the right time to post because 24 hours had passed since the last post. Because it's somewhat random, I'm guessing the problem is some kind of write caching where the other sessions don't see the raised flag until a couple of microseconds have passed.
The plugin was raising the "flag" by simply writing to the wp_options table with the update_option() API in WordPress. The other user sessions were supposed to read that value with get_option() and see the flag, and then not run that piece of code that creates the post because a given session was already doing it. Then, when done, I lower the flag and the other sessions continue as normal.
But what it's doing is letting those other sessions in.
To make this work, I was using add_action('loop_start', 'checkToAddContent'). The odd thing about that hook is that it's called more than once on a page, and in fact some plugins may call it too. I don't know if there's a better event to hook. Even if I find an event that only runs once per page view, I still have multiple sessions to contend with (different users who may view the page at the same time), and I want only one session to trigger the content post when it is due on the schedule.
I'm wondering if there are any WordPress plugin devs out there who could suggest another event hook to latch on to, and another way to raise a flag that all sessions would see. I mean, I could use the shared memory API in PHP, but many hosting plans have that disabled. I can't use a cookie or session var because that covers only a single session. About the only thing that might work across hosting plans would be to drop a file as a flag instead: if the file is present, one session has the flag; if the file is not present, other sessions can attempt to grab it. Sure, I could go the file route, but it feels immature in my opinion, and I was wondering if there's something within WordPress I could use.
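If I did go the file route, I suppose it would at least need flock() to stay atomic; something like this sketch (the lock file location is an arbitrary choice of mine):
// Drop-a-file flag made atomic with flock(): only one session can hold
// the exclusive lock at a time, and others skip the drip without blocking.
$fp = fopen(WP_CONTENT_DIR . '/drip.lock', 'c'); // 'c' creates the file if missing
if ($fp !== false && flock($fp, LOCK_EX | LOCK_NB)) {
    // This session holds the flag: create the scheduled post here.
    flock($fp, LOCK_UN);
}
if ($fp !== false) {
    fclose($fp);
}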
The key may be to create a semaphore record in the database for the "drip" event.
Warning: consider the following as pseudocode; I'm not looking up the exact functions.
When the post is queried, use an SQL statement like:
$ts = get_time_now(); // or whatever the function is
$sid = session_id();
INSERT INTO table (postcategory, timestamp, sessionid)
SELECT "$category", $ts, "$sid"
WHERE NOT EXISTS (SELECT 1 FROM table WHERE postcategory = "$category"
                  AND timestamp > $ts - 24 hours)
Database integrity will make this atomic, so only one record can be inserted, and the insertion will only take place if the timespan has been exceeded.
Then immediately check whether the row for the current timestamp carries your session_id(). If it does, drip.
SELECT sessionid FROM table
WHERE postcategory = "$category"
AND timestamp = $ts
AND sessionid = "$sid"
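Translated into WordPress terms, the idea might look like this (still a sketch: wp_drip_lock is a hypothetical custom table you would create on plugin activation, and the category name is arbitrary):
global $wpdb;
$ts = time();
$sid = session_id();
$category = 'daily-drip';
$table = $wpdb->prefix . 'drip_lock';

// Atomic-style insert: the row is only created if no lock row exists for
// this category from the last 24 hours, so at most one session can win.
$wpdb->query($wpdb->prepare(
    "INSERT INTO $table (postcategory, ts, sessionid)
     SELECT %s, %d, %s FROM DUAL
     WHERE NOT EXISTS (SELECT 1 FROM $table WHERE postcategory = %s AND ts > %d)",
    $category, $ts, $sid, $category, $ts - 86400
));

// If our row is the one that made it in, this session holds the semaphore.
$winner = $wpdb->get_var($wpdb->prepare(
    "SELECT sessionid FROM $table WHERE postcategory = %s AND ts = %d AND sessionid = %s",
    $category, $ts, $sid
));
if ($winner === $sid) {
    // Safe to drip the post here.
}
In practice, a UNIQUE key (e.g. on postcategory plus the drip date) is what actually enforces the one-winner guarantee under concurrent inserts.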
The problem arises with page requests even from the same session (same visitor), but can also occur with page requests from separate visitors. It works like this:
If you are doing content dripping, then a page request is probably what you intercept with add_action('wp','myPageRequest'). From there, if a scheduled post is due, then you create the new post.
The post takes a little time to write to the database. In that window, a query via get_posts() may not see the new record yet, and may actually trigger your code to create a new post when one has already been placed.
The fix, which appears to force WordPress to flush the write cache, is this:
try {
    $asPosts = array();
    $asPosts = wp_get_recent_posts(1);
    foreach ($asPosts as $asPost) { break; }
    // Touch a hidden custom field on the latest post to flush the write cache.
    delete_post_meta($asPost['ID'], '_thwart');
    add_post_meta($asPost['ID'], '_thwart', '' . date('Y-m-d H:i:s'));
} catch (Exception $e) {}

$asPosts = array();
$asPosts = wp_get_recent_posts(1);
foreach ($asPosts as $asPost) { break; }
$sLastPostDate = '';
$sLastPostDate = $asPost['post_date'];
$sLastPostDate = substr($sLastPostDate, 0, strpos($sLastPostDate, ' '));
$sNow = date('Y-m-d H:i:s');
$sNow = substr($sNow, 0, strpos($sNow, ' '));

if ($sLastPostDate != $sNow) {
    // No post today, so go ahead and post your new blog post.
    // Place that code here.
}
The first thing we do is get the most recent post, but we don't really care whether it's actually the most recent or not. All we're getting it for is a single post ID, so we can add a hidden custom field (hence the leading underscore) called
_thwart
...as in, thwart the write cache by posting some data to the database that's not too CPU-heavy.
Once that is in place, we use wp_get_recent_posts(1) again to see whether the most recent post carries today's date. If not, we are clear to drip some content in. (Or, if you want to drip only every 72 hours, etc., you can change this a little here.)