Decode WebSocket received data - encryption

I am working on a project for my own website where I need to scrape data from a target site over a WebSocket.
The data is a live feed of price ticks for currencies and stocks.
I am getting output in the following format.
Try it on http://websocket.org/echo.html:
Location: wss://streamer.finance.yahoo.com/
Click on Connect.
Message: {"subscribe":["AMZN"]}
OUTPUT:
CONNECTED
SENT: {"subscribe":["AMZN"]}
RECEIVED: CgRBTVpOFaQY3EQY4Kn0/99bKgNOTVMwCDgBRYjKzDxIyvN9ZQBQ4T7YAQQ=
RECEIVED: CgRBTVpOFaQY3EQY4Kn0/99bKgNOTVMwCDgBRYnKzDxIzPV9ZQBQ4T7YAQQ=
RECEIVED: CgRBTVpOFT0a3EQYsLn0/99bKgNOTVMwCDgBRYMG5DxIkP99ZQDg+j7YAQQ=
RECEIVED: CgRBTVpOFQAY3EQYwIf1/99bKgNOTVMwCDgBRYd5wzxIxod+ZQAQ1z7YAQQ=
RECEIVED: CgRBTVpOFQAY3EQYwIf1/99bKgNOTVMwCDgBRYd5wzxIroh+ZQAQ1z7YAQQ=
RECEIVED: CgRBTVpOFRQS3EQY8PT1/99bKgNOTVMwCDgBRYC1WjxIhI5+ZQCgcD7YAQQ=
RECEIVED: CgRBTVpOFRQS3EQY8PT1/99bKgNOTVMwCDgBRYG1WjxImo5+ZQCgcD7YAQQ=
RECEIVED: CgRBTVpOFUgN3EQY4KP2/99bKgNOTVMwCDgBRSBhnjtIvpJ+ZQBArj3YAQQ=
RECEIVED: CgRBTVpOFUgN3EQY4KP2/99bKgNOTVMwCDgBRSBhnjtI9J1+ZQBArj3YAQQ=
RECEIVED: CgRBTVpOFUgN3EQY4KP2/99bKgNOTVMwCDgBRSBhnjtIsqR+ZQBArj3YAQQ=
RECEIVED: CgRBTVpOFUgN3EQY4KP2/99bKgNOTVMwCDgBRSBhnjtInq5+ZQBArj3YAQQ=
I don't know how to decode this or what kind of encoding it is.
Can anybody tell me how to decode it, or what encoding/decoding scheme is being used?
I will use PHP for decoding (if decoding is possible).

I looked into that. I have no clue how to understand the workflow
and debug the function. JS is not my strong suit.
You are not tied to any particular language when interfacing with an external system that uses Protobuf.
Protobuf is an open technology that can generate marshalling code for multiple languages automatically once the meta description of the message is known.
So you do not need to reuse the available JavaScript code, only to extract the Protobuf structure out of it.
The Protobuf compiler will do all the dirty work for you.
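If you first want to convince yourself that these frames are just Base64-wrapped Protobuf, a quick check along these lines works (a minimal sketch; the frame string is one of those from the question, and the hex comments follow the standard Protobuf wire format):
<?php
// One of the received frames from the question
$frame = 'CgRBTVpOFaQY3EQY4Kn0/99bKgNOTVMwCDgBRYjKzDxIyvN9ZQBQ4T7YAQQ=';

// Base64-decode it; the result is binary data, not readable text
$binary = base64_decode($frame, true);
var_dump($binary !== false);               // bool(true): valid Base64

// The first bytes follow the Protobuf wire format:
// 0x0A = field 1, wire type 2 (length-delimited), 0x04 = length 4, then "AMZN"
echo bin2hex(substr($binary, 0, 6)), "\n"; // 0a04414d5a4e
echo substr($binary, 2, 4), "\n";          // AMZN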
You can easily reconstruct the .proto file just by looking into the __finStreamer-proto.js file:
PricingData.proto
syntax = "proto3";

message PricingData {
    enum QuoteType {
        NONE = 0;
        ALTSYMBOL = 5;
        HEARTBEAT = 7;
        EQUITY = 8;
        INDEX = 9;
        MUTUALFUND = 11;
        MONEYMARKET = 12;
        OPTION = 13;
        CURRENCY = 14;
        WARRANT = 15;
        BOND = 17;
        FUTURE = 18;
        ETF = 20;
        COMMODITY = 23;
        ECNQUOTE = 28;
        CRYPTOCURRENCY = 41;
        INDICATOR = 42;
        INDUSTRY = 1000;
    };

    enum OptionType {
        CALL = 0;
        PUT = 1;
    };

    enum MarketHoursType {
        PRE_MARKET = 0;
        REGULAR_MARKET = 1;
        POST_MARKET = 2;
        EXTENDED_HOURS_MARKET = 3;
    };

    string id = 1;
    float price = 2;
    sint64 time = 3;
    string currency = 4;
    string exchange = 5;
    QuoteType quoteType = 6;
    MarketHoursType marketHours = 7;
    float changePercent = 8;
    sint64 dayVolume = 9;
    float dayHigh = 10;
    float dayLow = 11;
    float change = 12;
    string shortName = 13;
    sint64 expireDate = 14;
    float openPrice = 15;
    float previousClose = 16;
    float strikePrice = 17;
    string underlyingSymbol = 18;
    sint64 openInterest = 19;
    OptionType optionsType = 20;
    sint64 miniOption = 21;
    sint64 lastSize = 22;
    float bid = 23;
    sint64 bidSize = 24;
    float ask = 25;
    sint64 askSize = 26;
    sint64 priceHint = 27;
    sint64 vol_24hr = 28;
    sint64 volAllCurrencies = 29;
    string fromcurrency = 30;
    string lastMarket = 31;
    double circulatingSupply = 32;
    double marketcap = 33;
};
Then you can use the Protobuf compiler to build PHP files out of it:
mkdir yahoo
protoc --php-out=yahoo PricingData.proto
Here is our composer.json as well:
{
    "require": {
        "google/protobuf": "^3.11",
        "ratchet/pawl": "^0.3.4"
    },
    "autoload": {
        "classmap": [
            "yahoo"
        ]
    }
}
And a PHP script to pull in the data:
#!/usr/bin/php
<?php
require __DIR__ . '/vendor/autoload.php';

\Ratchet\Client\connect('wss://streamer.finance.yahoo.com:443')->then(function ($conn) {
    $conn->on('message', function ($msg) use ($conn) {
        echo "Received: {$msg}\n";
        $packed = base64_decode($msg);
        $msg = new PricingData();
        $msg->mergeFromString($packed);
        var_dump($msg->serializeToJsonString());
    });
    $conn->send('{"subscribe":["BTC-USD","ETH-USD","XRP-USD","USDT-USD","BCH-USD","BA","TSLA","AXSM","UBER","MIRM","GRKZF","SCGPY","BDVSF","WPX","BIPSX","ENPIX","ENPSX","BPTUX","BPTIX","CL=F","GC=F","SI=F","EURUSD=X","GBPUSD=X","JPY=X","EZA","IXC","IYE","FILL","EWT","CGIX1191220P00005000","TORC191220P00002500","RIOT191213C00001000","TPCO191220C00002500","DHR","AMRN","AMD","PCG","VIX191218P00012500","VIX191218P00014000","EEM191220P00039000","EEM200117C00045000","BTCUSD=X","ETHUSD=X","AUDUSD=X","NZDUSD=X","EURJPY=X","GBPJPY=X","EURGBP=X","EURCAD=X","EURSEK=X","EURCHF=X","EURHUF=X","CNY=X","HKD=X","SGD=X","INR=X","MXN=X","PHP=X","IDR=X","THB=X","MYR=X","ZAR=X","RUB=X","ZG=F","ZI=F","PL=F","HG=F","PA=F","HO=F","NG=F","RB=F","BZ=F","B0=F","C=F","O=F","KW=F","RR=F","SM=F","BO=F","S=F","FC=F","LH=F","LC=F","CC=F","KC=F","CT=F","LB=F","OJ=F","SB=F","IFF","CRS","RLLCF","BGNE","^GSPC","^DJI","^IXIC","^RUT","^TNX","^VIX","^CMC200","^FTSE","^N225"]}');
}, function ($e) {
    echo "Could not connect: {$e->getMessage()}\n";
});
and.... here we go:
Received: CgNQQ0cVH4UvQRiQvr/a4lsqA05ZUTAIOAFFKBlXQUj61YQeZWhmpj/YAQQ=
string(202) "{"id":"PCG","price":10.97,"time":"1576616325000","exchange":"NYQ","quoteType":"EQUITY","marketHours":"REGULAR_MARKET","changePercent":13.443642,"dayVolume":"31495549","change":1.3000002,"priceHint":"2"}"
Received: CghFVVJHQlA9WBW6a1k/GODNv9riWyoDQ0NZMA44AUV+6bc/ZYAZRTzYAQg=
string(193) "{"id":"EURGBP=X","price":0.84930003,"time":"1576616326000","exchange":"CCY","quoteType":"CURRENCY","marketHours":"REGULAR_MARKET","changePercent":1.4368131,"change":0.012030005,"priceHint":"4"}"
Received: CghHQlBKUFk9WBVxvQ9DGODNv9riWyoDQ0NZMA44AUUQl7S/ZcClA8DYAQg=
string(192) "{"id":"GBPJPY=X","price":143.74001,"time":"1576616326000","exchange":"CCY","quoteType":"CURRENCY","marketHours":"REGULAR_MARKET","changePercent":-1.4108601,"change":-2.0569916,"priceHint":"4"}"
Received: CgVNWE49WBWqgpdBGJC+v9riWyoDQ0NZMA44AUXYFZ89ZQDYcDzYAQg=
string(191) "{"id":"MXN=X","price":18.938801,"time":"1576616325000","exchange":"CCY","quoteType":"CURRENCY","marketHours":"REGULAR_MARKET","changePercent":0.077678382,"change":0.014699936,"priceHint":"4"}"
Received: CgVTR0Q9WBWCi60/GJC+v9riWyoDQ0NZMA44AUXWfb49ZQAkpTq9AX6MrT/NAXGPrT/YAQg=
string(219) "{"id":"SGD=X","price":1.3558199,"time":"1576616325000","exchange":"CCY","quoteType":"CURRENCY","marketHours":"REGULAR_MARKET","changePercent":0.093013451,"change":0.001259923,"bid":1.35585,"ask":1.35594,"priceHint":"4"}"
Received: CgVKUFk9WBVQDdtCGODNv9riWyoDQ0NZMA44AUW/HUa9ZQAYWb3YAQg=
string(191) "{"id":"JPY=X","price":109.526,"time":"1576616326000","exchange":"CCY","quoteType":"CURRENCY","marketHours":"REGULAR_MARKET","changePercent":-0.048368212,"change":-0.053001404,"priceHint":"4"}"
Received: CgRVQkVSFZqZ7UEY4M2/2uJbKgNOWVEwCDgBRY8Vlb9IxqSRHGUAM7O+2AEE
string(210) "{"id":"UBER","price":29.700001,"time":"1576616326000","exchange":"NYQ","quoteType":"EQUITY","marketHours":"REGULAR_MARKET","changePercent":-1.1647204,"dayVolume":"29501731","change":-0.34999847,"priceHint":"2"}"
Received: CgRUU0xBFR9lvUMYwK6/2uJbKgNOTVMwCDgBRZvZNb9IoL/SB2WAcC3A2AEE
string(209) "{"id":"TSLA","price":378.79001,"time":"1576616324000","exchange":"NMS","quoteType":"EQUITY","marketHours":"REGULAR_MARKET","changePercent":-0.71035165,"dayVolume":"8015824","change":-2.7099915,"priceHint":"2"}"
Received: CghFVVJHQlA9WBXLZ1k/GLDdv9riWyoDQ0NZMA44AUWW/rY/ZcAdRDzYAQg=
string(193) "{"id":"EURGBP=X","price":0.84924001,"time":"1576616327000","exchange":"CCY","quoteType":"CURRENCY","marketHours":"REGULAR_MARKET","changePercent":1.4296443,"change":0.011969984,"priceHint":"4"}"
As you can see, Protobuf is an awesome, language-agnostic technology, so you do not have to cope with an unfamiliar language.

You can use or refer to my repository (thanks to Maxim for the proto file!).
It is an easy-to-use Python package.
Install the package:
pip install yliveticker
Create a livemarket.py file with the following code:
import yliveticker

# this function is called on each ticker update
def on_new_msg(msg):
    print(msg)

# insert your symbols here
yliveticker.YLiveTicker(on_ticker=on_new_msg, ticker_names=[
    "BTC=X", "^GSPC", "^DJI", "^IXIC", "^RUT", "CL=F", "GC=F", "SI=F", "EURUSD=X", "^TNX", "^VIX", "GBPUSD=X", "JPY=X", "BTC-USD", "^CMC200", "^FTSE", "^N225"])
Run the code:
python livemarket.py
Watch live market data appear in the console output.
If you don't see any results, make sure you are within the trading hours of your stock exchange.

Thanks to @Maxim, I was able to get a better understanding of how this works and made a NodeJS version of it.
Here is the code for a basic example; I will spend some time building dynamic subscriptions and a final app for it based on Electron. The idea is to get the decoded data and push it to a local Socket.IO server, where it can be used with VueJS in different apps.
const WebSocket = require('ws')
var ProtoBuf = require("protobufjs");
"use strict";

let Message = ProtoBuf
    .loadProtoFile('./PricingData.proto', (err, builder) => {
        Message = builder.build('PricingData')
        loadMessage()
    })

let loadMessage = () => {
    const url = 'wss://streamer.finance.yahoo.com'
    const connection = new WebSocket(url)

    connection.onopen = () => {
        connection.send('{"subscribe":["TSLA","AXSM","UBER","MIRM","GRKZF","BTCUSD=X","ETHUSD=X","AUDUSD=X","^DJI","^IXIC","^RUT","^TNX","^VIX","^CMC200","^FTSE","^N225"]}')
    }

    connection.onerror = (error) => {
        console.log(`WebSocket error: ${error}`)
    }

    connection.onmessage = (e) => {
        let msg = Message.decode(e.data)
        console.log('Decoded message', msg)
    }
}
Quick update:
Here is the full example in my repo: https://github.com/markosole/yahoo-node-streamer

Related

I have an error with this: Thread 1: EXC_BAD_ACCESS (code=1, address=0x18). I don't know if the nil value is because the recording is faulty.

This is my code; it uses AVAudioEngine to change the voice in my app.
Can anyone point out what's wrong in this code of mine and help me get rid of the error message?
extension AudioPlayer {

    func setupAudio() {
        // initialize the (recorded) audio file
        do {
            audioFile = try AVAudioFile(forReading: recordedAudioURL as URL)
        } catch {
            print("Audio File Error")
        }
    }

    func playSound(rate: Float? = nil, pitch: Float? = nil, echo: Bool = false, reverb: Bool = false) {
        // initialize audio engine components
        audioFile = AVAudioFile()
        audioEngine = AVAudioEngine()

        // node for playing audio
        audioPlayerNode = AVAudioPlayerNode()
        audioEngine.attach(audioPlayerNode) // attach the player node to the audio engine

        audioMixer = AVAudioMixerNode()
        audioEngine.attach(audioMixer) // attach a single output to the audio engine

        // node for adjusting rate/pitch
        let changeRatePitchNode = AVAudioUnitTimePitch()
        if let pitch = pitch {
            changeRatePitchNode.pitch = pitch
        }
        if let rate = rate {
            changeRatePitchNode.rate = rate
        }
        audioEngine.attach(changeRatePitchNode)

        // node for the echo
        let echoNode = AVAudioUnitDistortion()
        echoNode.loadFactoryPreset(.multiEcho1)
        audioEngine.attach(echoNode)

        // node for the reverb
        let reverbNode = AVAudioUnitReverb()
        reverbNode.loadFactoryPreset(.cathedral)
        reverbNode.wetDryMix = 50
        audioEngine.attach(reverbNode)

        // connect the player node to the output node
        if echo == true && reverb == true {
            connectAudioNode(audioPlayerNode, changeRatePitchNode, echoNode, reverbNode, audioMixer, audioEngine.outputNode)
        } else if echo == true {
            connectAudioNode(audioPlayerNode, changeRatePitchNode, echoNode, audioMixer, audioEngine.outputNode)
        } else if reverb == true {
            connectAudioNode(audioPlayerNode, changeRatePitchNode, echoNode, audioMixer, audioEngine.outputNode)
        } else {
            connectAudioNode(audioPlayerNode, changeRatePitchNode, audioMixer, audioEngine.outputNode)
        }

        // schedule to replay the entire audio file
        audioPlayerNode.stop()
        audioPlayerNode.scheduleFile(audioFile, at: nil)
        do {
            try audioEngine.start()
        } catch {
            print("Not OutPut")
        }

        // generate a media resource modeling the asset at the specified URL
        let audioAsset = AVURLAsset.init(url: recordedAudioURL)
        // CMTime duration converted to seconds
        let durationInSeconds = CMTimeGetSeconds(audioAsset.duration)

        let length = 4000
        let buffer = AVAudioPCMBuffer(pcmFormat: audioPlayerNode.outputFormat(forBus: 0), frameCapacity: AVAudioFrameCount(length))
        buffer!.frameLength = AVAudioFrameCount(durationInSeconds) // audio buffer for the PCM format

        // MARK: Create a new path after adding effects to the file
        let dirPath: AnyObject = NSSearchPathForDirectoriesInDomains(FileManager.SearchPathDirectory.documentDirectory, FileManager.SearchPathDomainMask.userDomainMask, true)[0] as AnyObject
        let tmpFileURL: NSURL = NSURL.fileURL(withPath: dirPath.appendingPathComponent("NewVoice.m4a")) as NSURL
        filteredOutputURL = tmpFileURL

        // settings for the new file
        do {
            print(dirPath)
            let settings = [AVFormatIDKey: kAudioFormatLinearPCM,
                            AVSampleRateKey: 44100,
                            AVNumberOfChannelsKey: 1]
            self.newAudio = try AVAudioFile(forWriting: tmpFileURL as URL, settings: settings)

            // settings for the output
            self.audioMixer.installTap(onBus: 0, bufferSize: (AVAudioFrameCount(durationInSeconds)), format: self.audioPlayerNode.outputFormat(forBus: 0)) { [self] (buffer: AVAudioPCMBuffer!, time: AVAudioTime!) in
                print(self.newAudio!.length)
                print(self.audioFile.length)
                if (self.newAudio!.length) < (self.audioFile.length) {
                    do {
                        try self.newAudio!.write(from: buffer)
                    } catch {
                        print("error writing")
                    }
                } else {
                    audioPlayerNode.removeTap(onBus: 0)
                }
            }
        } catch _ {
            print("problem")
        }

        // MARK: Play Sound Effect
        audioPlayerNode.play()
    }

    func connectAudioNode(_ nodes: AVAudioNode...) {
        for i in 0..<nodes.count - 1 {
            audioEngine.connect(nodes[i], to: nodes[i + 1], format: audioFile.processingFormat)
        } // *error here: "processingFormat"*
    }
}
The error points at "processingFormat".
I don't know whether the URL used to read the file is wrong or whether something is wrong with my settings.

STM32F4 HAL ADC DMA Transfer Error

I'm using an STM32F405OG for a project and one of the necessary functions is monitoring 3 analog channels at ~1 Hz. My desired implementation is starting an ADC DMA read of all 3 channels in Scan mode and retrieving the results at a later time after the DMA complete interrupt has occurred.
I'm using ADC1 and have tried both DMA channels 0 and 4, both with the same result: HAL_ADC_ErrorCallback() is invoked after the first call to HAL_ADC_Start_DMA(). At this point, the ADC handle is in an error state (HAL_ADC_STATE_ERROR_DMA) with the error code 0x04 (HAL_ADC_ERROR_DMA). Checking the linked DMA handle yields a DMA error code of HAL_DMA_ERROR_NO_XFER, meaning "Abort requested with no Xfer ongoing."
I'm totally lost as to what's causing this - my code should be consistent with examples and the "how to use this module" comments at the top of stm32f4xx_hal_adc.c. I've attached my code below.
ADC_HandleTypeDef ADC_hADC =
{
    .Instance = ADC1,
    .Init =
    {
        .ClockPrescaler = ADC_CLOCK_SYNC_PCLK_DIV8,
        .Resolution = ADC_RESOLUTION_12B,
        .EOCSelection = ADC_EOC_SEQ_CONV, // EOC at end of sequence of channel conversions
        .ScanConvMode = ENABLE,
        .ContinuousConvMode = DISABLE,
        .DiscontinuousConvMode = DISABLE,
        .NbrOfDiscConversion = 0U,
        .ExternalTrigConvEdge = ADC_EXTERNALTRIGCONVEDGE_NONE,
        .ExternalTrigConv = ADC_SOFTWARE_START,
        .DataAlign = ADC_DATAALIGN_RIGHT,
        .NbrOfConversion = _NUM_ADC_CONV,
        .DMAContinuousRequests = DISABLE
    }
};

DMA_HandleTypeDef _hDmaAdc =
{
    .Instance = DMA2_Stream0,
    .Init =
    {
        .Channel = DMA_CHANNEL_0,
        .Direction = DMA_PERIPH_TO_MEMORY,
        .PeriphInc = DMA_PINC_DISABLE,
        .MemInc = DMA_MINC_ENABLE,
        .PeriphDataAlignment = DMA_PDATAALIGN_WORD,
        .MemDataAlignment = DMA_MDATAALIGN_WORD,
        .Mode = DMA_NORMAL,
        .Priority = DMA_PRIORITY_HIGH,
        .FIFOMode = DMA_FIFOMODE_DISABLE,
        .FIFOThreshold = DMA_FIFO_THRESHOLD_FULL,
        .MemBurst = DMA_MBURST_SINGLE,
        .PeriphBurst = DMA_PBURST_SINGLE
    }
};

void HAL_ADC_MspInit(ADC_HandleTypeDef *h)
{
    if (!h)
    {
        return;
    }
    else if (h->Instance == ADC1)
    {
        __HAL_RCC_ADC1_CLK_ENABLE();
        __HAL_RCC_DMA2_CLK_ENABLE();

        HAL_DMA_Init(&_hDmaAdc);
        __HAL_LINKDMA(h, DMA_Handle, _hDmaAdc);

        HAL_NVIC_SetPriority(ADC_IRQn, IT_PRIO_ADC, 0);
        HAL_NVIC_SetPriority(DMA2_Stream0_IRQn, IT_PRIO_ADC, 0);
        HAL_NVIC_EnableIRQ(ADC_IRQn);
        HAL_NVIC_EnableIRQ(DMA2_Stream0_IRQn);
    }
}

uint32_t _meas[3];

ADC_ChannelConfTypeDef _chanCfg[3] =
{
    // VIN_MON
    {
        .Channel = ADC_CHANNEL_1,
    },
    // VDD_MON
    {
        .Channel = ADC_CHANNEL_8,
    },
    // VDD2_MON
    {
        .Channel = ADC_CHANNEL_2,
    }
};

Bool ADC_Init(void)
{
    ADC_DeInit();
    memset(_meas, 0, sizeof(_meas));

    Bool status = (HAL_ADC_Init(&ADC_hADC) == HAL_OK);
    if (status)
    {
        // Configure each ADC channel
        for (uint32_t i = 0U; i < NELEM(_chanCfg); i++)
        {
            _chanCfg[i].Rank = (i + 1U);
            _chanCfg[i].SamplingTime = ADC_SAMPLETIME_480CYCLES;
            _chanCfg[i].Offset = 0U;

            if (HAL_ADC_ConfigChannel(&ADC_hADC, &_chanCfg[i]) != HAL_OK)
            {
                status = FALSE;
                break;
            }
        }

        _state = ADC_STATE_READY;
    }

    if (!status)
    {
        ADC_DeInit();
    }

    return status;
}

Bool ADC_StartRead(void)
{
    Bool status = TRUE;
    status = (HAL_ADC_Start_DMA(&ADC_hADC, &_meas[0], 3) == HAL_OK);
    return status;
}
After slowing down the conversions via the ClockPrescaler init structure field, increasing the number of ADC cycles, and calling HAL_ADC_Stop_DMA() (per file header comment in stm32f4xx_hal_adc.c), everything is working.
Note that calling HAL_ADC_Stop_DMA() in the DMA Transfer Complete ISR caused the aforementioned error conditions as well, so calls to that function will have to be made sometime after the DMAXferCplt ISR is invoked.
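In case a concrete shape for that helps, here is a minimal sketch of the pattern, assuming a simple main-loop poll; the _adcDone flag and the ADC_Poll() function are illustrative names, not taken from the code above:
#include "stm32f4xx_hal.h"

extern ADC_HandleTypeDef ADC_hADC;    // defined above
extern uint32_t _meas[3];             // DMA target buffer, defined above

static volatile uint8_t _adcDone = 0; // set in the ISR, consumed in the main loop

// HAL invokes this callback from the DMA transfer-complete interrupt
void HAL_ADC_ConvCpltCallback(ADC_HandleTypeDef *hadc)
{
    if (hadc->Instance == ADC1)
    {
        _adcDone = 1;  // only flag completion; do not call HAL_ADC_Stop_DMA() here
    }
}

// Called periodically from the main loop (~1 Hz is plenty for this use case)
void ADC_Poll(void)
{
    if (_adcDone)
    {
        _adcDone = 0;
        HAL_ADC_Stop_DMA(&ADC_hADC);  // safe outside interrupt context
        // ... consume _meas[0 .. _NUM_ADC_CONV - 1], then call ADC_StartRead() again
    }
}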

HoughLineP Segmentation Fault OpenCV

I am using Unix and developed a program that runs without problems under Windows.
When running under Unix I get a segmentation fault at the HoughLinesP operation:
vector<Vec4i> lines;
HoughLinesP(image, lines, 1, CV_PI/180, a, b, c );
The variables a, b, c are declared in the constructor and later changed by some sliders in the GUI.
The vector lines is empty and should be filled by the HoughLinesP function, but as described the function fails.
Is this a Unix-specific problem?
I haven't found any solutions on the web.
imageanalyzer.h:
vector<Vec4i> m_lines;
ImageAnalyzer::ImageAnalyzer(QObject *parent) : QObject(parent)
{
    m_performAnalysis = false;
    m_lowThresh = 74;
    m_rho = 1;
    m_highThresh = 205;
    m_threshold = 20;
    m_minLineLength = 16;
    m_maxLineGap = 16;
    m_blurSize = 3;
    m_adptTreshSize = 33;
    m_dilationSize = 1;
    m_erodeSize = 1;
    m_adptTreshC = -15;
    m_heading = 260;
    m_matchingValue = 0;
}
the problem occurs here:
void ImageAnalyzer::extractLines()
{
    // Probabilistic Hough Transform
    HoughLinesP(getCanny4HoughImg(), m_lines, (double)m_rho, CV_PI/180, m_threshold, (double)m_minLineLength, (double)m_maxLineGap);
}
getCanny4HoughImg() returns an image. I have already checked this.

ABCpdf generates a blank page on IIS

I am creating a PDF from HTML using ABCpdf 8.0. It works well locally but generates a blank page on IIS. I have already downgraded IE and given all permissions to the folder. When I try to generate the PDF from an external link like Google.com it works perfectly. Moreover, my link is accessible and there is no error on the page. Please find the code below for your reference.
var url = "test.com";
if (XSettings.InstallLicense(abcPDFkey))
{
    using (Doc theDoc = new Doc())
    {
        // apply a rotation transform
        double w = theDoc.MediaBox.Width;
        double h = theDoc.MediaBox.Height;
        double l = theDoc.MediaBox.Left;
        double b = theDoc.MediaBox.Bottom;
        theDoc.Transform.Rotate(90, l, b);
        theDoc.Transform.Translate(w, 0);

        // to fix time out
        theDoc.HtmlOptions.RetryCount = 1;
        theDoc.HtmlOptions.Timeout = 25000;

        // rotate our rectangle
        theDoc.Rect.Width = h;
        theDoc.Rect.Height = w;
        theDoc.HtmlOptions.Engine = EngineType.Gecko;
        theDoc.HtmlOptions.ImageQuality = 60;

        int theID;
        theID = theDoc.AddImageUrl(url);
        while (true)
        {
            theDoc.FrameRect();
            if (!theDoc.Chainable(theID))
                break;
            theDoc.Page = theDoc.AddPage();
            theID = theDoc.AddImageToChain(theID);
            int NewtheID = theDoc.GetInfoInt(theDoc.Root, "Pages");
            theDoc.SetInfo(NewtheID, "/Rotate", "90");
        }
        for (int i = 1; i <= theDoc.PageCount; i++)
        {
            theDoc.PageNumber = i;
            theDoc.Flatten();
        }
        foreach (IndirectObject io in theDoc.ObjectSoup)
        {
            if (io is PixMap)
            {
                PixMap pm = (PixMap)io;
                pm.Realize(); // eliminate indexed color images
                pm.Resize(pm.Width / 6, pm.Height / 6);
            }
        }
        theDoc.Save(System.Web.HttpContext.Current.Server.MapPath("PDFFileName"));
        theDoc.Clear();
    }
}
Please help, thanks
OK, I figured out what the issue was.
First of all, I had defined relative paths for all my images; secondly, our server has an internal IP, so I defined the URL with the internal IP instead of the public domain. That fixed my issue.
Cheers!

Negotiating an allocator between DirectShow filters fails

I'm developing a custom DirectShow source filter to provide decompressed video data to a rendering filter. I've used the PushSource sample provided by the DirectShow SDK as a basis for my filter. I'm attempting to connect this to a VideoMixingRenderer9 filter.
When creating the graph I'm calling ConnectDirect():
HRESULT hr = mp_graph_builder->ConnectDirect(OutPin, InPin, &mediaType);
but during this call, calling SetProperties on the downstream filter's allocator (in DecideBufferSize()) fails with D3DERR_INVALIDCALL (0x8876086c):
ALLOCATOR_PROPERTIES actual;
memset(&actual,0,sizeof(actual));
hr = pAlloc->SetProperties(pRequest, &actual);
If I let it try to use my allocator (the one provided by CBaseOutputPin) when setting the allocator on the downstream filter, this fails with E_FAIL (in CBaseOutputPin::DecideAllocator):
hr = pPin->NotifyAllocator(*ppAlloc, FALSE);
Any help would be much appreciated!
Thanks.
EDIT:
This is the media type provided by GetMediaType
VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER*)pMediaType->AllocFormatBuffer(sizeof(VIDEOINFOHEADER));
if (pvi == 0)
    return E_OUTOFMEMORY;
ZeroMemory(pvi, pMediaType->cbFormat);
pvi->AvgTimePerFrame = m_rtFrameLength;
pMediaType->formattype = FORMAT_VideoInfo;
pMediaType->majortype = MEDIATYPE_Video;
pMediaType->subtype = MEDIASUBTYPE_RGB24;
pMediaType->bTemporalCompression = FALSE;
pMediaType->bFixedSizeSamples = TRUE;
pMediaType->formattype = FORMAT_VideoInfo;
pvi->bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
pvi->bmiHeader.biWidth = (640 / 128 + 1) * 128;
pvi->bmiHeader.biHeight = -480; // negative so top down..
pvi->bmiHeader.biPlanes = 1;
pvi->bmiHeader.biBitCount = 24;
pvi->bmiHeader.biCompression = NULL; // ok if rgb else use MAKEFOURCC(...)
pvi->bmiHeader.biSizeImage = GetBitmapSize(&pvi->bmiHeader);
pvi->bmiHeader.biClrImportant = 0;
pvi->bmiHeader.biClrUsed = 0; //Use max colour depth
pvi->bmiHeader.biXPelsPerMeter = 0;
pvi->bmiHeader.biYPelsPerMeter = 0;
SetRectEmpty(&(pvi->rcSource));
SetRectEmpty(&(pvi->rcTarget));
pvi->rcSource.bottom = 480;
pvi->rcSource.right = 640;
pvi->rcTarget.bottom = 480;
pvi->rcTarget.right = 640;
pMediaType->SetType(&MEDIATYPE_Video);
pMediaType->SetFormatType(&FORMAT_VideoInfo);
pMediaType->SetTemporalCompression(FALSE);
const GUID SubTypeGUID = GetBitmapSubtype(&pvi->bmiHeader);
pMediaType->SetSubtype(&SubTypeGUID);
pMediaType->SetSampleSize(pvi->bmiHeader.biSizeImage);
And here is DecideBufferSize, where pAlloc->SetProperties is called:
HRESULT CPushPinBitmap::DecideBufferSize(IMemAllocator *pAlloc, ALLOCATOR_PROPERTIES *pRequest) {
    HRESULT hr;
    CAutoLock cAutoLock(CBasePin::m_pLock);

    CheckPointer(pAlloc, E_POINTER);
    CheckPointer(pRequest, E_POINTER);

    if (pRequest->cBuffers == 0) {
        pRequest->cBuffers = 2;
    }
    pRequest->cbBuffer = 480 * ((640 / 128 + 1) * 128) * 3;

    ALLOCATOR_PROPERTIES actual;
    memset(&actual, 0, sizeof(actual));
    hr = pAlloc->SetProperties(pRequest, &actual);
    if (FAILED(hr)) {
        return hr;
    }
    if (actual.cbBuffer < pRequest->cbBuffer) {
        return E_FAIL;
    }
    return S_OK;
}
The constants are only temporary!
There is no way you can use your own allocator with the VMR/EVR filters. They simply insist on their own, which in turn is backed by DirectDraw/Direct3D surfaces.
To connect directly to the VMR/EVR filters you need a different strategy: the allocator is always theirs, and you need to support extended strides. See Handling Format Changes from the Video Renderer.
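To make the stride handling concrete, here is a minimal sketch of a FillBuffer that adopts the renderer's updated media type; the GetSourceRow helper and the fixed 640x480 dimensions are assumptions for illustration, not taken from the question's code:
HRESULT CPushPinBitmap::FillBuffer(IMediaSample *pSample)
{
    CheckPointer(pSample, E_POINTER);

    // The video renderer attaches a new media type to a sample when the surface stride changes.
    AM_MEDIA_TYPE *pmt = NULL;
    if (pSample->GetMediaType(&pmt) == S_OK && pmt != NULL)
    {
        m_mt = *pmt;       // adopt the renderer's media type; biWidth now carries the stride
        DeleteMediaType(pmt);
    }

    VIDEOINFOHEADER *pvi = (VIDEOINFOHEADER *)m_mt.Format();
    const LONG strideBytes = pvi->bmiHeader.biWidth * 3;  // RGB24 row stride in bytes
    const LONG height = 480;                              // visible image height (temporary constant)
    const LONG rowBytes = 640 * 3;                        // visible bytes per row

    BYTE *pData = NULL;
    HRESULT hr = pSample->GetPointer(&pData);
    if (FAILED(hr))
    {
        return hr;
    }

    // Copy each source row at the negotiated stride rather than the nominal width.
    for (LONG y = 0; y < height; y++)
    {
        CopyMemory(pData + y * strideBytes, GetSourceRow(y) /* assumed helper */, rowBytes);
    }

    pSample->SetActualDataLength(height * strideBytes);
    return S_OK;
}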
