SSL routines:OPENSSL_internal:WRONG_VERSION_NUMBER - grpc

I use an Objective-C client and the server is in Go. When I try to stream from the server to the client I get this error; do you have any idea why? Thank you very much.
Error: SSL routines:OPENSSL_internal:WRONG_VERSION_NUMBER
Proto:
service Event {
  rpc eventSubscription (stream eventRequest) returns (stream eventResponse) {}
  rpc listFeatures (eventRequest) returns (stream eventResponse) {}
}

message eventRequest {
  string id = 1;
}

message eventResponse {
  string id = 1;
  string key = 2;
  string value = 3;
}
ObjC code:
#import <Foundation/Foundation.h>
#import <GRPCClient/GRPCCall+Tests.h>
#import "GrpcService.h"
#import <GrpcService/Event.pbrpc.h>
static NSString * const hostAddress = @"localhost:50052";

RCT_EXPORT_METHOD(stream:(NSString *)name)
{
    [GRPCCall useInsecureConnectionsForHost:hostAddress];
    CPHWEvent *client = [[CPHWEvent alloc] initWithHost:hostAddress];
    CPHWeventRequest *request = [CPHWeventRequest message];
    GRPCUnaryResponseHandler *handler =
        [[GRPCUnaryResponseHandler alloc] initWithResponseHandler:
            ^(CPHWeventResponse *response, NSError *error) {
                if (response) {
                    NSLog(@"%@", response);
                } else {
                }
            } responseDispatchQueue:nil];
    GRPCUnaryProtoCall *call = [client listFeaturesWithMessage:request responseHandler:handler callOptions:nil];
    [call start];
}
Go code:
func (s *Server) EventSubscription(stream pbEvent.Event_EventSubscriptionServer) error {
	log.Println("Started stream")
	length := len(global.TestStrings)
	for {
		TestStrings := global.TestStrings
		if len(TestStrings) > length {
			response := &pbEvent.EventResponse{
				Id:    TestStrings[length].Id,
				Value: TestStrings[length].Value,
				Key:   TestStrings[length].Key,
			}
			if err := stream.Send(response); err != nil {
				log.Println(err)
			} else {
				log.Println("send...")
				//global.TestStrings = append(global.TestStrings[:0], global.TestStrings[1:]...)
				length++
			}
		}
	}
}
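For what it's worth, OPENSSL_internal:WRONG_VERSION_NUMBER almost always means one side of the connection is speaking TLS while the other is speaking plaintext, e.g. a client attempting a TLS handshake against a server started without transport credentials (or an insecure-host setting that does not actually apply to the call API being used). The mismatch is easy to reproduce outside gRPC; here is a minimal Python sketch (the server and port are illustrative, not the actual service above):

```python
import socket
import ssl
import threading

# A plaintext "server" that answers with raw bytes instead of a TLS ServerHello.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)

def serve():
    conn, _ = srv.accept()
    conn.sendall(b"plaintext, no TLS here\n")
    conn.close()

threading.Thread(target=serve, daemon=True).start()

# A client that insists on TLS, like a gRPC channel using default (secure) credentials.
ctx = ssl.create_default_context()
ctx.check_hostname = False
ctx.verify_mode = ssl.CERT_NONE

raw = socket.create_connection(srv.getsockname())
try:
    ctx.wrap_socket(raw)
    reason = None
except ssl.SSLError as e:
    reason = e.reason  # typically WRONG_VERSION_NUMBER
finally:
    raw.close()
    srv.close()

print(reason)
```

If the Go server is meant to stay plaintext, it is worth verifying that the Objective-C call path really runs insecure (the legacy useInsecureConnectionsForHost: setting may not cover every call API); otherwise start the server with TLS credentials so both sides match.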

Related

Wso2 Stream Processor : Error occurred while processing eventByteBufferQueue

I have two nodes of a WSO2 API Manager analytics server (2.6.0), which is the WSO2 Stream Processor. I see the following error on the passive node of the cluster; the active node is fine and shows no errors. The analytics results have no impact for users viewing data on the API Publisher or Store, but the passive node keeps logging this error.
Please advise what is causing the following issue:
[2019-02-26 17:06:09,513] ERROR {org.wso2.carbon.stream.processor.core.ha.tcp.EventSyncServer} - Error occurred while processing eventByteBufferQueue null java.nio.BufferUnderflowException
I just met the same issue; here are my problem and solution.
1) I am using the WSO2 SP HA deployment.
2) When an event comes in on the active node, some of its fields are NULL according to the source mapping of the stream.
3) The active node syncs this event to the passive node.
4) The passive node picks the event data up from the 'eventByteBufferQueue' as part of the standby-takeover mechanism.
5) The passive node cannot parse the data from the active node and reports the error.
The root cause is that SP only supports NULL for String fields by default; when a LONG, INTEGER, etc. field is NULL, the error occurs. For me, NULL Long fields are the normal case, but you can also change the data type to string.
Here is my solution, patching org.wso2.carbon.stream.processor.core_2.0.478.jar to add logic supporting NULL.
BinaryMessageConverterUtil.java, for sending event data from the active node:
public final class BinaryMessageConverterUtil {

    public static int getSize(Object data) {
        if (data instanceof String) {
            return 4 + ((String) data).length();
        } else if (data instanceof Integer) {
            return 4;
        } else if (data instanceof Long) {
            return 8;
        } else if (data instanceof Float) {
            return 4;
        } else if (data instanceof Double) {
            return 8;
        } else if (data instanceof Boolean) {
            return 1;
        } else if (data == null) {
            return 0;
        } else {
            //TODO
            return 4;
        }
    }

    public static EventDataMetaInfo getEventMetaInfo(Object data) {
        int eventSize;
        Attribute.Type attributeType;
        if (data instanceof String) {
            attributeType = Attribute.Type.STRING;
            eventSize = 4 + ((String) data).length();
        } else if (data instanceof Integer) {
            attributeType = Attribute.Type.INT;
            eventSize = 4;
        } else if (data instanceof Long) {
            attributeType = Attribute.Type.LONG;
            eventSize = 8;
        } else if (data instanceof Float) {
            attributeType = Attribute.Type.FLOAT;
            eventSize = 4;
        } else if (data instanceof Double) {
            attributeType = Attribute.Type.DOUBLE;
            eventSize = 8;
        } else if (data instanceof Boolean) {
            attributeType = Attribute.Type.BOOL;
            eventSize = 1;
        } else if (data == null) {
            attributeType = Attribute.Type.OBJECT;
            eventSize = 0; // no content between the HA nodes for NULL fields
        } else {
            //TODO
            attributeType = Attribute.Type.OBJECT;
            eventSize = 1;
        }
        return new EventDataMetaInfo(eventSize, attributeType);
    }

    public static void assignData(Object data, ByteBuffer eventDataBuffer) throws IOException {
        if (data instanceof String) {
            eventDataBuffer.putInt(((String) data).length());
            eventDataBuffer.put(((String) data).getBytes(Charset.defaultCharset()));
        } else if (data instanceof Integer) {
            eventDataBuffer.putInt((Integer) data);
        } else if (data instanceof Long) {
            eventDataBuffer.putLong((Long) data);
        } else if (data instanceof Float) {
            eventDataBuffer.putFloat((Float) data);
        } else if (data instanceof Double) {
            eventDataBuffer.putDouble((Double) data);
        } else if (data instanceof Boolean) {
            eventDataBuffer.put((byte) (((Boolean) data) ? 1 : 0));
        } else if (data == null) {
            // put nothing into the buffer for NULL fields
        } else {
            eventDataBuffer.putInt(0);
        }
    }

    public static String getString(ByteBuf byteBuf, int size) throws UnsupportedEncodingException {
        byte[] bytes = new byte[size];
        byteBuf.readBytes(bytes);
        return new String(bytes, Charset.defaultCharset());
    }

    public static String getString(ByteBuffer byteBuf, int size) throws UnsupportedEncodingException {
        byte[] bytes = new byte[size];
        byteBuf.get(bytes);
        return new String(bytes, Charset.defaultCharset());
    }
}
SiddhiEventConverter.java, for processing event data on the passive node:
static Object[] toObjectArray(ByteBuffer byteBuffer,
                              String[] attributeTypeOrder) throws UnsupportedEncodingException {
    if (attributeTypeOrder != null) {
        Object[] objects = new Object[attributeTypeOrder.length];
        for (int i = 0; i < attributeTypeOrder.length; i++) {
            switch (attributeTypeOrder[i]) {
                case "INT":
                    objects[i] = byteBuffer.getInt();
                    break;
                case "LONG":
                    objects[i] = byteBuffer.getLong();
                    break;
                case "STRING":
                    int stringSize = byteBuffer.getInt();
                    if (stringSize == 0) {
                        objects[i] = null;
                    } else {
                        objects[i] = BinaryMessageConverterUtil.getString(byteBuffer, stringSize);
                    }
                    break;
                case "DOUBLE":
                    objects[i] = byteBuffer.getDouble();
                    break;
                case "FLOAT":
                    objects[i] = byteBuffer.getFloat();
                    break;
                case "BOOL":
                    objects[i] = byteBuffer.get() == 1;
                    break;
                case "OBJECT":
                    // for NULL fields
                    objects[i] = null;
                    break;
                default:
                    // will not occur
            }
        }
        return objects;
    } else {
        return null;
    }
}
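The wire-format idea behind the patch can be seen in miniature: strings go out as a 4-byte length plus payload, and a NULL field contributes a zero length (or, for non-string types, no bytes at all plus an OBJECT type marker). A small Python sketch of the string case (illustrative only, not the actual WSO2 classes):

```python
import struct

def encode(fields):
    """Length-prefixed encoding: 4-byte big-endian length, then UTF-8 bytes;
    a zero length marks a NULL field (mirroring the stringSize == 0 check)."""
    buf = b""
    for f in fields:
        if f is None:
            buf += struct.pack(">i", 0)
        else:
            data = f.encode("utf-8")
            buf += struct.pack(">i", len(data)) + data
    return buf

def decode(buf, count):
    fields, off = [], 0
    for _ in range(count):
        (size,) = struct.unpack_from(">i", buf, off)
        off += 4
        if size == 0:
            fields.append(None)  # zero length -> NULL field
        else:
            fields.append(buf[off:off + size].decode("utf-8"))
            off += size
    return fields

print(decode(encode(["a", None, "bc"]), 3))  # ['a', None, 'bc']
```

Note that under this scheme an empty string becomes indistinguishable from NULL, which is the trade-off the patch accepts for String fields.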

How can I monitor the ~/.local directory using Vala?

I am trying to monitor the ~/.local directory. According to the Vala documentation I can monitor the home directory correctly, but I can't monitor ~/.local.
initFileMonitor V1:
public void initFileMonitor () {
    try {
        string homePath = Environment.get_home_dir ();
        string filePath = homePath + "/.local";
        File file = File.new_for_path (filePath);
        FileMonitor monitor = file.monitor_directory (FileMonitorFlags.NONE, null);
        print ("\nMonitoring: %s\n", file.get_path ());
        monitor.changed.connect ((src, dest, event) => {
            if (dest != null) {
                print ("%s: %s, %s\n", event.to_string (), src.get_path (), dest.get_path ());
            } else {
                print ("%s: %s\n", event.to_string (), src.get_path ());
            }
        });
    } catch (Error err) {
        print ("Error: %s\n", err.message);
    }
}
terminal output (no error, no monitoring):
Monitoring: /home/srdr/.local
Because the file monitor is stored in a local variable, it is, like other local variables, destroyed (or in GObject terms, finalised/destructed) at the end of the function call.
To ensure it lives long enough, make it a field of a class; the FileMonitor instance is then owned by an instance of that class rather than by each call to a specific method.
Runnable demo (valac demo.vala --pkg gio-2.0):
class FileMonitorDemo {
    private FileMonitor monitor;

    public void initFileMonitor () {
        var path = Path.build_filename (Environment.get_home_dir (), ".local");
        var file = File.new_for_path (path);
        try {
            monitor = file.monitor_directory (NONE);
            message ("Monitoring: %s", file.get_path ());
            monitor.changed.connect ((src, dest, event) => {
                if (dest != null) {
                    print ("%s: %s, %s\n", event.to_string (), src.get_path (), dest.get_path ());
                } else {
                    print ("%s: %s\n", event.to_string (), src.get_path ());
                }
            });
        } catch (Error err) {
            critical ("Error: %s\n", err.message);
        }
    }
}

void main () {
    var filemon = new FileMonitorDemo ();
    filemon.initFileMonitor ();
    new MainLoop ().run ();
}
You need to actually run the monitor by creating a main loop and having it wait for events:
new MainLoop ().run ();

Counting the number of transmissions in TinyOS 2.x

I'm trying to implement an application in nesC that can count the number of transmissions performed during the simulation, but I'm facing many difficulties; no approach that I tried works. Could anyone help me? This is my application:
module FloodingC {
    uses {
        interface Boot;
        interface SplitControl as AMControl;
        interface Timer<TMilli> as MilliTimer;
        interface Receive;
        interface AMSend;
        interface Packet;
        interface AMPacket;
        interface RootControl;
        interface PacketAcknowledgements as PackAck;
    }
}
implementation {
    message_t packet;
    bool locked = FALSE;
    uint16_t flag;

    event void Boot.booted() {
        flag = 0;
        call AMControl.start();
    }

    event void AMControl.startDone(error_t err) {
        if (err == SUCCESS) {
            if (TOS_NODE_ID == 1)
                call RootControl.setRoot();
            call MilliTimer.startOneShot(1024);
        } else {
            call AMControl.start();
        }
    }

    event void AMControl.stopDone(error_t err) {
    }

    void sendMsg() {
        floodingMsg_t* msg = (floodingMsg_t*) call Packet.getPayload(&packet, sizeof(floodingMsg_t));
        if (msg == NULL) {
            return;
        }
        flag = 1;
        msg->nodeid = TOS_NODE_ID;
        msg->counter = 1;
        call PackAck.requestAck(&packet);
        if (call AMSend.send(AM_BROADCAST_ADDR, &packet, sizeof(floodingMsg_t)) == SUCCESS) {
            locked = TRUE;
        }
    }

    event void MilliTimer.fired() {
        if (locked) {
            return;
        } else {
            if (call RootControl.isRoot()) {
                sendMsg();
            }
        }
    }

    event void AMSend.sendDone(message_t *msg, error_t error) {
        if (call PackAck.wasAcked(msg) == SUCCESS) {
            locked = FALSE;
        } else {
            sendMsg();
        }
    }

    event message_t* Receive.receive(message_t* msg, void* payload, uint8_t len) {
        floodingMsg_t* newMsg = (floodingMsg_t*) payload;
        if (locked == TRUE) return msg;
        if (flag == 0) {
            flag = 1;
            newMsg->nodeid = TOS_NODE_ID;
            newMsg->counter++;
            call AMSend.send(AM_BROADCAST_ADDR, msg, call Packet.maxPayloadLength());
        }
        return msg;
    }
}
Thanks
You can count them using a BaseStation sniffer (included in TinyOS), or by adding a sequence number to your transmitted packets.
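The sequence-number approach can be sketched in a few lines of Python (a hypothetical packet format, not TinyOS code): each node stamps outgoing packets with an incrementing counter, so a sniffer can recover per-node transmission totals even when the payloads are identical.

```python
from collections import defaultdict

# Per-node transmit counter, stamped into every outgoing packet.
next_seq = defaultdict(int)

def transmit(node_id, payload):
    next_seq[node_id] += 1
    return {"node": node_id, "seq": next_seq[node_id], "payload": payload}

# Node 1 sends twice (one retransmission of the same payload), node 2 sends once.
log = [transmit(1, "msg"), transmit(1, "msg"), transmit(2, "msg")]

# The sniffer recovers per-node totals from the highest sequence number seen.
totals = {}
for pkt in log:
    totals[pkt["node"]] = max(totals.get(pkt["node"], 0), pkt["seq"])

print(totals)  # {1: 2, 2: 1}
```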

If I know the name of a function, is it possible to get its parameters?

Dart code:
void hello(String name) {
  print(name);
}

main() {
  var funcName = "hello";
  // how to get the parameter `String name`?
}
Using the function name as a string, "hello", is it possible to get the parameter String name of the real function hello?
You can use mirrors to do that.
import 'dart:mirrors';

void hello(String name) {
  print(name);
}

main() {
  var funcName = "hello";
  // get the top level functions in the current library
  Map<Symbol, MethodMirror> functions =
      currentMirrorSystem().isolate.rootLibrary.functions;
  MethodMirror func = functions[new Symbol(funcName)];
  // once the function is found: get its parameters
  List<ParameterMirror> params = func.parameters;
  for (ParameterMirror param in params) {
    String type = MirrorSystem.getName(param.type.simpleName);
    String name = MirrorSystem.getName(param.simpleName);
    //....
    print("$type $name");
  }
}
You can also get this information through reflection (which is not yet fully complete):
library hello_library;

import 'dart:mirrors';

void main() {
  var mirrors = currentMirrorSystem();
  const libraryName = 'hello_library';
  var libraries = mirrors.findLibrary(const Symbol(libraryName));
  var length = libraries.length;
  if (length == 0) {
    print('Library not found');
  } else if (length > 1) {
    print('Found more than one library');
  } else {
    var method = getStaticMethodInfo(libraries.first, const Symbol('hello'));
    var parameters = getMethodParameters(method);
    if (parameters != null) {
      for (ParameterMirror parameter in parameters) {
        print('name: ${parameter.simpleName}:, type: ${parameter.type.simpleName}');
      }
    }
  }
}

MethodMirror getStaticMethodInfo(LibraryMirror library, Symbol methodName) {
  if (library == null) {
    return null;
  }
  return library.functions[methodName];
}

List<ParameterMirror> getMethodParameters(MethodMirror method) {
  if (method == null) {
    return null;
  }
  return method.parameters;
}

void hello(String name) {
  print(name);
}

IOS UIImagePicker with mp4 format

Is it possible to save video captured from UIImagePicker in mp4 format and add it to a custom ALAsset? Or do I have to save it as .mov and compress it with AVAssetExportSession?
Yes, you can compress video using AVAssetExportSession. You can specify the video type, quality and output URL for the compressed video.
See the methods below:
- (void)saveVideoToLocal:(NSURL *)videoURL {
    @try {
        NSArray *documentsDirectory = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *docPath = [documentsDirectory objectAtIndex:0];
        NSString *videoName = [NSString stringWithFormat:@"sampleVideo.mp4"];
        NSString *videoPath = [docPath stringByAppendingPathComponent:videoName];
        NSURL *outputURL = [NSURL fileURLWithPath:videoPath];
        NSLog(@"Loading video");
        [self convertVideoToLowQuailtyWithInputURL:videoURL outputURL:outputURL handler:^(AVAssetExportSession *exportSession) {
            if (exportSession.status == AVAssetExportSessionStatusCompleted) {
                NSLog(@"Compression is done");
            }
            [self performSelectorOnMainThread:@selector(doneCompressing) withObject:nil waitUntilDone:YES];
        }];
    }
    @catch (NSException *exception) {
        NSLog(@"Exception: %@", exception.description);
        [self performSelectorOnMainThread:@selector(doneCompressing) withObject:nil waitUntilDone:YES];
    }
}

//---------------------------------------------------------------

- (void)convertVideoToLowQuailtyWithInputURL:(NSURL *)inputURL outputURL:(NSURL *)outputURL handler:(void (^)(AVAssetExportSession *))handler {
    [[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetPassthrough];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeMPEG4;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void) {
        handler(exportSession);
    }];
}
Here I saved the compressed video to the application's documents directory. You can see this working in detail in the sample code below.
Sample demo:
func imagePickerController(_ picker: UIImagePickerController, didFinishPickingMediaWithInfo info: [String : Any]) {
    picker.dismiss(animated: true, completion: nil)
    guard let mediaType = info[UIImagePickerControllerMediaType] as? String else {
        return
    }
    if mediaType == "public.movie" {
        if let videoURL = info[UIImagePickerControllerMediaURL] as? URL {
            var videoData: Data!
            do {
                videoData = try Data(contentsOf: videoURL, options: [Data.ReadingOptions.alwaysMapped])
            } catch {
                print(error.localizedDescription)
                return
            }
            if videoData != nil {
                let writePath = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("vid1.mp4")
                print("writePath - \(writePath)")
                do {
                    try videoData.write(to: writePath)
                } catch {
                    print("Error - \(error.localizedDescription)")
                }
            }
        }
    }
}
