Getting the HTML DOM from websites that render content client-side (JavaScript)
It doesn't work because
<div class="currTableContainer" id="divCurrTableContainer">
is empty. There is nothing in it, so you won't find a table in it; hence the null reference. The HTML you're looking for is generated by a script (AJAX).
Check the loaded HTML to confirm.
Because I was bored I decided to finish your homework. You can't use HAP (Html Agility Pack) in this case; you have to use the AJAX service they provide. That's better anyway, since the data is prone to change a lot. The code the site uses (the AJAX script) looks like this:
$("document").ready(function () {
$.ajax({
type: "POST",
url: '/_layouts/15/LINKDev.CIB.CurrenciesFunds/FundsCurrencies.aspx/GetCurrencies',
async: true,
data: "{'lang':'" + document.getElementById("ctl00_ctl48_g_5d7fc52f_a66d_4aa2_8d6c_c01fb4b38cb2_hdnLang").value + "'}",
contentType: "application/json; charset=utf-8",
dataType: "json",
success: function (msg) {
if (msg.d != null && msg.d.length > 0) {
var contentHTML = "<table class='currTable' cellspacing='0' rules='all' style='border-width:0px;border-collapse:collapse;'>"
+ "<tbody><tr class='currHeaderRow'>"
+ "<th scope='col'>" + document.getElementById("ctl00_ctl48_g_5d7fc52f_a66d_4aa2_8d6c_c01fb4b38cb2_hdnCurrency").value + "</th><th scope='col'>" + document.getElementById("ctl00_ctl48_g_5d7fc52f_a66d_4aa2_8d6c_c01fb4b38cb2_hdnBuy").value + "</th><th scope='col'>" + document.getElementById("ctl00_ctl48_g_5d7fc52f_a66d_4aa2_8d6c_c01fb4b38cb2_hdnSell").value + "</th>"
+ "</tr>";
for (var i = 0; i < msg.d.length; i++) {
if (msg.d[i].CurrencyID.length > 0) {
contentHTML += "<tr class='currRow'>"
+ "<td>" + msg.d[i].CurrencyID + "</td><td>" + msg.d[i].BuyRate + "</td><td class='lastCell'>" + msg.d[i].SellRate + "</td>"
+ "</tr>";
}
}
contentHTML += "</tbody></table>";
$("#divCurrTableContainer").html(contentHTML);
if ($(".bannerElements").length > 0)
FixCurrenciesRatesScroll();
}
},
error: function (msg) {
}
});
});
As you can see, the script POSTs to a URL on a different part of the site to update the currency table. To fetch the data yourself, you just make an HTTP request with the JSON body {"lang":"en"}. I have translated the script into its C# equivalent below. The response from the HTTP request will be JSON formatted, which means you will have to create a class to deserialize the data into. I recommend you look at Newtonsoft.Json for this, since I can't do all your homework.
using System;
using System.IO;
using System.Net;
using System.Text;

namespace test
{
    class Program
    {
        public static void Main(string[] args)
        {
            try
            {
                string webAddr = "http://www.cibeg.com/_layouts/15/LINKDev.CIB.CurrenciesFunds/FundsCurrencies.aspx/GetCurrencies";

                var httpWebRequest = (HttpWebRequest)WebRequest.Create(webAddr);
                httpWebRequest.ContentType = "application/json";
                httpWebRequest.Method = "POST";
                httpWebRequest.Accept = "application/json"; // we expect a JSON response back

                // The same body the site's own script sends
                string datastring = "{\"lang\":\"en\"}";
                byte[] data = Encoding.UTF8.GetBytes(datastring); // UTF-8 matches the charset the script declares
                httpWebRequest.ContentLength = data.Length;
                httpWebRequest.GetRequestStream().Write(data, 0, data.Length);

                var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
                using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
                {
                    var responseText = streamReader.ReadToEnd();
                    Console.WriteLine(responseText);
                    // Now you have your response text; what you do next depends on the information in it.
                }
            }
            catch (WebException ex)
            {
                Console.WriteLine(ex.Message);
            }
            Console.ReadKey();
        }
    }
}
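Once you have the response text, deserializing is straightforward. A minimal sketch with Newtonsoft.Json - the property names CurrencyID, BuyRate and SellRate come from the site's script above, and the d wrapper is the standard ASP.NET web-service JSON envelope; the string types for the rates are an assumption, adjust if the service returns numbers:

using System.Collections.Generic;
using Newtonsoft.Json;

public class CurrencyRate
{
    public string CurrencyID { get; set; }
    public string BuyRate { get; set; }  // assumed string; may be numeric
    public string SellRate { get; set; }
}

public class CurrenciesEnvelope
{
    public List<CurrencyRate> d { get; set; } // ASP.NET wraps the payload in "d"
}

// Usage, right after reading responseText:
// var rates = JsonConvert.DeserializeObject<CurrenciesEnvelope>(responseText).d;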
Related
I'm trying to catch an error when uploading a file that is too large.
At the server, the multipart section reader throws, which I catch and return as a BadRequest (also tried InternalError):
try
{
    var section = await reader.ReadNextSectionAsync();
    while (section != null)
    {
        ...
    }
}
catch (Exception ex)
{
    return StatusCode((int)HttpStatusCode.BadRequest, ProblemFactory.Shared.BadRequestProblem("Could not upload file", ex.Message));
}
To upload, I have the following (using RestSharp currently, but I get the same result with HttpClient via HttpClientFactory):
var request = new RestRequest(REQ_UPLOADFILE, Method.POST, DataFormat.Json);
var token = await _agentTokenService.GetToken();
AddTokenHeader(request, token.AccessToken);
request.AddFile("file", path);
request.AddParameter("externalFileType", fileType, ParameterType.GetOrPost);
request.AddParameter("subType", subType, ParameterType.GetOrPost);

var resp = await _client.ExecuteTaskAsync(request);
if (resp.IsSuccessful)
{
    return JsonConvert.DeserializeObject<ExternalFileResponse>(resp.Content);
}
else
{
    string reason = "unknown error";
    //switch(resp.StatusCode)
    //{
    //    case HttpStatusCode.???
    //}
    throw new Exception($"Could not upload file: {reason}");
}
The response from the POST is status code 0 with the message:
The stream does not support concurrent IO read or write operations
The upload is running in a task, so I guess it's something to do with that, but there is only a single upload running, and if the file is smaller it works without a problem.
I can only think that something in the response handling is fracking with this somehow.
Does anyone have a clue??
Thanks and Merry Christmas :)
PS: I'm using Kestrel only - no IIS or nginx - with options thus:
.UseKestrel(options =>
{
    options.AddServerHeader = false;
    options.Limits.MaxRequestBodySize = 100 * 1024 * 1024; // 100 MB
})
UPDATE
I think I understand this better now.
The server is terminating the connection when reading the bytes from the stream.
The client, writing asynchronously, continues for a little bit, but then I try to read the response - hence the error message.
UPDATE
HttpClient does actually respond a little differently - I get my BadRequest, and the message is that the stream was closed.
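A hedged sketch of what the server-side catch could look like so the client gets a clean 413 instead of a torn connection - this assumes ASP.NET Core on Kestrel, where exceeding Limits.MaxRequestBodySize surfaces as a BadHttpRequestException (the exact namespace of that type varies by ASP.NET Core version):

try
{
    var section = await reader.ReadNextSectionAsync();
    while (section != null)
    {
        // ... process the section ...
        section = await reader.ReadNextSectionAsync();
    }
}
catch (Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException ex)
{
    // thrown when the request body exceeds Limits.MaxRequestBodySize
    return StatusCode(StatusCodes.Status413PayloadTooLarge, ex.Message);
}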
Here is a simple way to upload a large file using ASP.NET.
(1) Download WebUploader from https://github.com/fex-team/webuploader
(2) WebUploader can split a large file into chunk files. Create a web form:
<!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
<title></title>
<script src="../Javascript/jquery-1.9.1.min.js"></script>
<link href="../javascript/webuploader/webuploader.css" rel="stylesheet" />
<script src="../javascript/webuploader/webuploader.js"></script>
</head>
<body >
<form>
<br /><br />
<div id="uploader" class="wu-example">
<div class="btns">
<div id="picker" style="display:inline-block">Select a File</div>
<input id="ctlBtn" type="button" value="Upload" class="btn btn-primary" style="border-radius:0px; position:relative; top:-13px"/>
</div>
<div id="thelist" class="uploader-list"></div>
</div>
<script>
_extensions = '3gp,mp4,rmvb,mov,avi,m4v';
_mimeTypes = 'video/*,audio/*,application/*';
var GUID = WebUploader.Base.guid(); // a GUID shared by all chunks of this upload
// alert(GUID);
uploader = WebUploader.create({
auto: false,
// swf path
swf: '../javascript/webuploader/Uploader.swf',
// server page that receives the chunks
server: '_uploadVideo.aspx',
pick: {
id: '#picker',
label: 'Select a File',
innerHTML: 'Select a File',
multiple: false
},
fileNumLimit: 1,
fileSingleSizeLimit: 1024 * 1024 * 120,
accept: {
title: 'large file',
extensions: _extensions,
mimeTypes: _mimeTypes, // eg. image/*,
},
chunked: true, // split the large file into chunks
chunkSize: 1024 * 1024 * 2, // size of each chunk; here 2 MB
formData: {
guid: GUID,
types:"upload"
}
});
uploader.on('fileQueued', function (file) {
$("#thelist").append('<div id="' + file.id + '" class="item">' +
'<b class="info">' + file.name + '</b> ' +
'<p class="state">wait...</p>' + '</div>');
});
uploader.on('uploadSuccess', function (file, response) {
// ask the server to merge the chunks back into one file
$('#' + file.id).find('p.state').html('<font color=green>upload success</font>');
$.post('_uploadVideo.aspx', { guid: GUID, fileName: file.name, types: "merge" ,r:Math.random(),itemid:<%=Request.QueryString["itemid"]%> },
function (data) {
if (data == 1) {
alert("success");
}
else {
alert("fail");
}
});
});
uploader.on("error", function (type, handler) {
if (type == "Q_TYPE_DENIED") {
alert("format error");
} else if (type == "F_EXCEED_SIZE") {
alert("size too large");
}
});
uploader.on('uploadProgress', function (file, percentage) {
var $li = $('#' + file.id),
$percent = $li.find('.progress .progress-bar');
// create the progress bar only once (skip if it already exists from a re-upload)
if (!$percent.length) {
$percent = $('<div class="progress progress-striped active">' +
'<div class="progress-bar" role="progressbar" style="width: 0%">' +
'</div>' +
'</div>').appendTo($li).find('.progress-bar');
}
$li.find('p.state').text('uploading');
$percent.css('width', percentage * 100 + '%');
});
$("#ctlBtn").click(
function () {
uploader.upload();
}
);
</script>
</form>
</body>
</html>
(3) Create an ASP.NET Web Forms page _uploadVideo.aspx.
Delete all its content; the .aspx should contain only one line:
<%@ Page Language="C#" AutoEventWireup="true" CodeFile="_uploadVideo.aspx.cs" Inherits="Gallery.Gallery._uploadVideo" %>
(4) In the code-behind _uploadVideo.aspx.cs, write the code below:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;
using System.IO;
using System.Data.SqlClient;

namespace Gallery.Gallery
{
    public partial class _uploadVideo : System.Web.UI.Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (Request["types"] == "upload")
            {
                if (Request.Form.AllKeys.Any(m => m == "chunk"))
                {
                    int year = DateTime.Now.Year;
                    // get the chunk index and the total number of chunks
                    int chunk = Convert.ToInt32(Request["chunk"]);
                    int chunks = Convert.ToInt32(Request["chunks"]);
                    string folder = Server.MapPath("../uploads/video/" + year + "/" + Request["guid"] + "/");
                    string path = folder + chunk;
                    // create the temporary transfer folder if it does not exist
                    if (!Directory.Exists(Path.GetDirectoryName(folder)))
                    {
                        Directory.CreateDirectory(folder);
                    }
                    FileStream addFile = new FileStream(path, FileMode.Append, FileAccess.Write);
                    BinaryWriter AddWriter = new BinaryWriter(addFile);
                    // get the uploaded chunk's data stream
                    var file = Request.Files[0];
                    Stream stream = file.InputStream;
                    BinaryReader TempReader = new BinaryReader(stream);
                    // append the uploaded chunk to the end of the temporary file
                    AddWriter.Write(TempReader.ReadBytes((int)stream.Length));
                    // close and dispose the readers and writers
                    TempReader.Close();
                    stream.Close();
                    AddWriter.Close();
                    addFile.Close();
                    TempReader.Dispose();
                    stream.Dispose();
                    AddWriter.Dispose();
                    addFile.Dispose();
                    string f_ext = Path.GetExtension(file.FileName);
                    string _result = "{\"chunked\" :\"true\",\"hasError\" :\"false\",\"f_ext\" :\"" + f_ext + "\" }";
                    System.Web.HttpContext.Current.Response.Write(_result);
                }
            }
            if (Request["types"] == "merge")
            {
                try
                {
                    int year = DateTime.Now.Year;
                    var guid = Request["guid"]; // the GUID sent with every chunk
                    var uploadDir = Server.MapPath("../uploads/video/" + year + "/"); // upload folder
                    // create the upload folder if it does not exist
                    if (!Directory.Exists(Path.GetDirectoryName(uploadDir)))
                    {
                        Directory.CreateDirectory(uploadDir);
                    }
                    var dir = Path.Combine(uploadDir, guid); // temporary chunk folder
                    var ext = Path.GetExtension(Request["fileName"]);
                    var files = Directory.GetFiles(dir); // all the chunk files in it
                    var name = Guid.NewGuid().ToString("N") + ext;
                    var finalPath = Path.Combine(uploadDir, name); // final file path
                    var fs = new FileStream(finalPath, FileMode.Create);
                    // sort by path length, then name, so the numeric chunk names are written in order 0..N
                    foreach (var part in files.OrderBy(x => x.Length).ThenBy(x => x))
                    {
                        var bytes = System.IO.File.ReadAllBytes(part);
                        fs.Write(bytes, 0, bytes.Length);
                        bytes = null;
                        System.IO.File.Delete(part); // delete the chunk
                    }
                    fs.Flush();
                    fs.Close();
                    Directory.Delete(dir); // delete the temporary folder
                    // INSERT INTO DB finalPath
                    SqlParameter[] p = {
                        new SqlParameter("@videopath", "../uploads/video/" + year + "/" + name)
                    };
                    string sql = @"update portal_photoes set videopath=@videopath where id=" + int.Parse(Request["itemid"]);
                    // execute the SQL here to store the path in the database
                    System.Web.HttpContext.Current.Response.Write("1");
                }
                catch (Exception ex)
                {
                    System.Web.HttpContext.Current.Response.Write("0");
                }
            }
        }
    }
}
I am creating a custom MultipartStreamProvider to store files in Azure File Storage as part of a lift-and-shift effort on a legacy application. The client uses AngularJS on the front end and Web API on the back end. When I try to use the MultipartStreamProvider, I need to implement GetStream to return a stream for it to write to. I am using cloudFile.OpenWrite, which asks for the size of the stream/file that will be written to it. However, in GetStream, ContentDisposition.Size is empty. Is there a way I can either make AngularJS send the content size for each file, or, on the back end, dig the size of the file out of somewhere else? Any help would be greatly appreciated. Thanks!
MultipartStreamProvider
public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
{
    // For form data, a Content-Disposition header is a requirement
    ContentDispositionHeaderValue contentDisposition = headers.ContentDisposition;
    Console.WriteLine(files.Count);
    if (contentDisposition != null)
    {
        // create a default filename if it is missing
        contentDisposition.FileName = (String.IsNullOrEmpty(contentDisposition.FileName) ? $"{Guid.NewGuid()}.data" : contentDisposition.FileName);

        // We won't post process files as form data
        _isFormData.Add(false);
        CloudMultipartFileData fileData = new CloudMultipartFileData(headers, _fileRepository.BaseUrl, contentDisposition.FileName); // your - aws - filelocation - url - maybe);
        _fileData.Add(fileData);
        var azureStream = _fileRepository.GetWriteStream(contentDisposition.Size, _relativeDirectory, fileData.FileName);
        return azureStream;

        // We will post process this as form data
        // (unreachable after the return above; kept from the original sample)
        _isFormData.Add(true);
    }
    throw new InvalidOperationException("Did not find required 'Content-Disposition' header field in MIME multipart body part.");
}
Actual Call to Azure
public override Stream GetWriteStream(long? fileSize, string relativeDirectory, string filename)
{
    var combinedRelativeDirectory = GetCloudDirectory(relativeDirectory);
    CloudFile cloudFile = combinedRelativeDirectory.GetFileReference(filename);
    return cloudFile.OpenWrite(fileSize, null, null);
}
AngularJS File Upload Code
/********************************** Add/Upload Photos **************************************/
$scope.$watch('files', function (files) {
    $scope.formUpload = false;
    console.log(files);
    if (files != null) {
        for (var i = 0; i < files.length; i++) {
            $scope.errorMsg = null;
            (function (file) {
                upload(file);
            })(files[i]);
        }
    }
});

function upload(file) {
    file.upload = Upload.upload({
        url: window.location.origin + "/api/mydocs/uploadfile?storeFolder=" + $scope.attachmentFolder + "&storeId=" + $scope.storeId + "&userId=" + $scope.currentUser.UserId,
        method: 'POST',
        headers: {},
        fields: {},
        file: file
    });
}
I wound up manually adding the file size to the header:
function upload(file) {
    file.upload = Upload.upload({
        url: window.location.origin + "/api/mydocs/uploadfile?storeFolder=" + $scope.myFolder + "&clientId=" + $scope.clientId,
        method: 'POST',
        headers: { 'file-info': file.name + "-/" + file.size },
        fields: {},
        file: file
    });
}
And then in the constructor, I use that to create a lookup table:
public MyCloudMultipartFormDataStreamProvider(string relativeDirectory, IEnumerable<string> lookupInfo)
{
    NewFileNames = new Dictionary<string, string>();
    _fileRepository = new CloudFileRepository();
    _relativeDirectory = relativeDirectory;
    _uploadedFilesLookup = new Dictionary<string, long>();
    foreach (var fileInfo in lookupInfo)
    {
        var values = Regex.Split(fileInfo, @"-/");
        _uploadedFilesLookup.Add(values[0], Int64.Parse(values[1]));
    }
}
Then grab the file's size out of the lookup table and pass that to my GetWriteStream method
var azureStream = _fileFacade.GetWriteStream(_uploadedFilesLookup[fileName],
_relativeDirectory, fileData.FileName, out newFileName);
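For completeness, a hedged sketch of the wiring this implies but does not show - reading the file-info headers in the Web API upload action and handing them to the provider (the action body here is an assumption; only the header name and constructor match the snippets above):

// In the Web API upload action (assumed, not from the original post):
IEnumerable<string> lookupInfo = Request.Headers.GetValues("file-info"); // one "name-/size" entry per file
var provider = new MyCloudMultipartFormDataStreamProvider(relativeDirectory, lookupInfo);
await Request.Content.ReadAsMultipartAsync(provider);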
I have a weird problem whereby my export-to-Excel functionality doesn't seem to work.
I'm using jQuery and AJAX to pass HTML data to an HTTP handler, which has some simple context.Response code that all seems fine. Anyway, here's my code:
// my hyperlink for the user to click
<a id="hyperLink" href="#">Click Here to Export</a>
My jQuery/AJAX code:
<script type="text/javascript">
    $(document).ready(function () {
        $("#hyperLink").click(function (e) {
            var result = $('#output').html();
            var newRes = result.replace('\n', '');
            $.ajax({
                url: "ExportToExcelhandler.ashx",
                data: { 'htmlData': newRes },
                dataType: "json",
                type: "POST",
                success: function (data) {
                    alert(data);
                }
            });
        });
    });
</script>
and my handler:
public void ProcessRequest(HttpContext context)
{
    string htmlStuff = context.Request["htmlData"];
    string trimStart = "";
    string trimEnd = "";
    if (htmlStuff != null)
    {
        trimStart = htmlStuff.Substring(75, htmlStuff.Length - 75);
        trimEnd = trimStart.Remove(trimStart.Length - 8, 8) + "";
    }
    string finalHtmlData = trimEnd; // was "final"; the name must match the variable written below
    context.Response.Clear();
    context.Response.Buffer = true;
    context.Response.AddHeader("content-disposition", "attachment; filename=excelData.xls");
    context.Response.ContentType = "application/vnd.ms-excel";
    HttpResponse response = context.Response;
    context.Response.Output.Write(finalHtmlData);
    context.Response.Flush();
    context.Response.End();
}
-- Granted, I'm doing some weird things with the replace function in my jQuery, and with Substring and Remove in my handler; this is because I had to trim my HTML data so that only the table with the data inside it was included (it caused an error otherwise). The HTML data is just report data. So the HTML data is passed fine to the handler, and it goes through the ProcessRequest method fine, yet it doesn't export to Excel. Any help would be greatly appreciated, thanks.
Split this into two HTTP handlers, one to generate the Excel data and a second to retrieve it and have a resource point at it. The key point is that an AJAX success callback only ever hands you the response text - the browser never turns an AJAX response into a file download - so the actual download has to be a normal navigation to the second handler, like this:
GenerateExcelDocument HTTP handler code:
public void ProcessRequest(HttpContext context)
{
    string htmlStuff = context.Request["htmlData"];
    var docIdentifier = this.GenerateExcelDocument(htmlStuff);
    context.Response.ContentType = "text/plain";
    context.Response.Write(docIdentifier.ToString("N"));
}

private Guid GenerateExcelDocument(string htmlStuff) // the parameter was missing
{
    var identifier = Guid.NewGuid();
    string trimStart = "";
    string trimEnd = "";
    if (htmlStuff != null)
    {
        trimStart = htmlStuff.Substring(75, htmlStuff.Length - 75);
        trimEnd = trimStart.Remove(trimStart.Length - 8, 8) + "";
    }
    // Logic that generates your document and saves it using the identifier
    // Can save to database, file system, etc.
    return identifier;
}
Now you can call this HTTP handler, like this:
$(document).ready(function () {
    $("#hyperLink").click(function (e) {
        var result = $('#output').html();
        var newRes = result.replace('\n', '');
        $.ajax({
            url: "GenerateExcelDocument.ashx",
            data: { 'htmlData': newRes },
            dataType: "text", // the handler writes a plain-text GUID, not JSON
            type: "POST",
            success: function (result) {
                window.location.href = '/RetrieveExcelDocument.ashx/' + result;
            }
        });
    });
});
Note: The success callback is where you hook the file retrieval up to the page: navigating to the retrieval handler behaves just like the plain anchor-tag href that worked before you needed to pass data to the handler.
Finally, we need to build the retrieval HTTP handler logic to actually get the Excel document, based upon the identifier returned from the GenerateExcelDocument HTTP handler call, like this:
RetrieveExcelDocument HTTP handler code:
public void ProcessRequest(HttpContext context)
{
    // take the last URL segment - the identifier appended after the handler name (requires System.Linq)
    var identifier = new Guid(context.Request.Url.Segments.Last());
    // Get Excel document content from database, file system, etc. here
    var fileContent = GetExcelDocument(identifier);
    context.Response.AddHeader("content-disposition",
        "attachment; filename=excelData.xls");
    context.Response.ContentType = "application/vnd.ms-excel";
    context.Response.OutputStream.Write(fileContent, 0, fileContent.Length);
}
I am trying to write test cases for our AJAX calls to our API, doing a simple web request and response. My question is with regard to the response. Is there a simpler way to pull out the response JSON values? Is this the best way to do this sort of thing? I know we could use jQuery, but I wanted to use the Microsoft testing framework.
[TestMethod]
public void TestMethod1()
{
    string brand = "KEWL";
    string BRAND = "";
    var httpWebRequest = (HttpWebRequest)WebRequest.Create("http://203.135.xx.138:4040/api/v1/subscriptions/signup.format");
    httpWebRequest.ContentType = "application/json";
    httpWebRequest.Method = "POST";
    using (var streamWriter = new StreamWriter(httpWebRequest.GetRequestStream()))
    {
        string json = @"{" +
            " 'api_key': '91230D10-247C-11E1-83FF-9B9C4824019B'," +
            " 'phone': '12122639043', " +
            " 'dob': '11231954', " +
            " 'subscriptions': [ " +
            " {" +
            " 'Brand':'" + brand + "', " +
            " 'campaign':'BTLNDN', " +
            " 'groups':[" +
            " {" +
            " 'group': 'BTLALL'," +
            " 'subscribed':true" +
            " } " +
            " ]," +
            " 'lang': 'en' " +
            " }" +
            " ] " +
            " }";
        streamWriter.Write(json);
    }
    var httpResponse = (HttpWebResponse)httpWebRequest.GetResponse();
    using (var streamReader = new StreamReader(httpResponse.GetResponseStream()))
    {
        var responseText = streamReader.ReadToEnd();
        JavaScriptSerializer serializer = new JavaScriptSerializer();
        Dictionary<string, dynamic> dc = serializer.Deserialize<Dictionary<string, dynamic>>(responseText);
        var kev = dc;
        foreach (var key1 in dc.Keys)
        {
            var value3 = dc["ReturnData"]["subscriptions"];
            BRAND = value3[0]["brand"];
            // var groups = value3[0]["groups"];
        }
    }
    Assert.AreEqual(brand, BRAND);
}
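For comparison, a hedged sketch of the same extraction with Newtonsoft's JObject, which avoids walking nested dictionaries - assuming the same ReturnData/subscriptions response shape as above:

using Newtonsoft.Json.Linq;

var parsed = JObject.Parse(responseText);
string brandFromJson = (string)parsed["ReturnData"]["subscriptions"][0]["brand"];
Assert.AreEqual(brand, brandFromJson);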
The idea of unit testing ASP.NET MVC methods is that you can run the test without using any HTTP request or response functionality.
Suppose you have the following method:
public class MyController : Controller
{
    public ActionResult MyAjax()
    {
        return Json(new { Test = "Test" });
    }
}
You can test it with this code:
[TestMethod]
public void MyTest()
{
    MyController controller = new MyController();
    JsonResult json = controller.MyAjax() as JsonResult;
    Assert.IsNotNull(json);
    dynamic data = json.Data;
    Assert.AreEqual("Test", data.Test);
}
To use the dynamic keyword you have to make sure that your test project can see the internals of your web project (this is because anonymous types are declared internal). You can do this by adding: [assembly: InternalsVisibleTo("YourTestProject")] to the AssemblyInfo.cs file of your web project.
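For example, in the web project's Properties/AssemblyInfo.cs:

using System.Runtime.CompilerServices;

[assembly: InternalsVisibleTo("YourTestProject")]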
I'm trying to send a custom HTML object from my ASP.NET 2.0 website to my web service through jQuery AJAX, but I can't get it to work.
Everything is parsed correctly in my web service when I drop the ObjectHTML part, but I get an error when I add the ObjectHTML part.
Is it possible to send custom JavaScript objects?
function SavePage() {
    var rowCount = $('#pageArea div.object').length;
    var i = 1;
    var objects = "[";
    $('.object').each(function (index) {
        var objectHtml = new ObjectHTML($(this).html());
        objects += "{'ObjectID': " + "'" + $(this).attr('objectid') + "', 'ObjectIndex': '" + $(this).attr('objectindex') + "', 'ObjectHTML': " + objectHtml + "}";
        if (i == rowCount)
            objects += ""
        else
            objects += ",";
        i++;
    });
    objects += "]";
    alert("{'objects': " + objects + "}");
    $.ajax({
        type: "POST",
        contentType: "application/json; charset=utf-8",
        url: "/Folder/ObjectService.asmx/SavePage",
        data: "{'objects': " + objects + "}",
        dataType: "json",
        success:
            function (msg) {
                alert("Success!");
            },
        error:
            function (XMLHttpRequest, textStatus, errorThrown) {
                alert("Error Occured: " + errorThrown);
            }
    });
}

function ObjectHTML(rawHtml) {
    this.Html = rawHtml;
}
Webservice code:
[WebMethod(EnableSession = true)]
public string SavePage(List<PageObject> objects)
{
    return "";
}

public class PageObject
{
    private string _objectid, _objectindex;
    private ObjectHTML _objectHtml;

    public string ObjectID
    {
        get { return _objectid; }
        set { _objectid = value; }
    }
    public string ObjectIndex
    {
        get { return _objectindex; }
        set { _objectindex = value; }
    }
    public ObjectHTML ObjectHTML
    {
        get { return _objectHtml; }
        set { _objectHtml = value; }
    }
}

public class ObjectHTML
{
    private string _Html;
    public string Html
    {
        get { return _Html; }
        set { _Html = value; }
    }
}
It looks to me like you are getting a little confused between the C# classes on your server and the JavaScript objects in your script.
One thing that you could do is encode your HTML for JSON by using JSON.stringify - concatenating objectHtml into the string, as the code above does, just produces "[object Object]", which breaks the JSON:
var myObject = JSON.stringify({
    ObjectId: $(this).attr('objectid'),
    ObjectIndex: $(this).attr('objectIndex'),
    ObjectHtml: $(this).html()
});
This will make sure that the HTML is encoded as valid JSON.