Error counting leading zeros of __int128_t

#include <iostream>
#include <stdint.h>
using namespace std;
typedef __int128_t int128_t;
int main() {
int128_t y = 0;
cout<<__builtin_clzll(static_cast<unsigned long long>(y>>64)) <<endl;
cout<<__builtin_clzll(static_cast<unsigned long long>(0)) <<endl;
}
the output:
63
64
gcc version 5.3.0 20151204
Why are the results wrong? Is there another way to count the leading zeros of an int128_t?

Related

Add new system call at FreeBSD 10.1

I want to add a new system call to FreeBSD. My system call code is:
#include <sys/types.h>
#include <sys/param.h>
#include <sys/systm.h>
#include <sys/kernel.h>
#include <sys/proc.h>
#include <sys/mount.h>
#include <sys/sysproto.h>
int Sum(int a, int b);
int
Sum(a,b)
{
int c;
c = a + b;
return (0);
}
But when I rebuild the kernel, I have an error:
What's wrong? Can you help me?
Thanks a lot.
Here's how I did it with my example system call of setkey which takes two unsigned ints.
I added my system call to the end of /sys/kern/syscalls.master
546 AUE_NULL STD { int setkey(unsigned int k0, unsigned int k1);}
Then I did
cd /usr/src
sudo make -C /sys/kern/ sysent
Next, I added the file to /sys/conf/files
kern/sys_setkey.c standard
My sys_setkey.c is as follows
#include <sys/sysproto.h>
#include <sys/proc.h>
//required for printf
#include <sys/types.h>
#include <sys/systm.h>
#ifndef _SYS_SYSPROTO_H_
struct setkey_args {
unsigned int k0;
unsigned int k1;
};
#endif
/* ARGSUSED */
int sys_setkey(struct thread *td, struct setkey_args *args)
{
printf("Hello, Kernel!\n");
return 0;
}
Also, I added the system call to /sys/kern/capabilities.conf
##
## Allow associating SHA1 key with user
##
setkey
Finally, while in /usr/src/ I ran the commands
sudo make -j8 kernel
sudo reboot
This is a program which runs the system call
#include <sys/syscall.h>
#include <unistd.h>
#include <stdio.h>
int main(){
// syscall takes the syscalls.master offset and the system call arguments
printf("out = %d\n",syscall(546,1,1));
return 0;
}
I think you haven't included the file with your sys_Sum function in the kernel makefile. (Note that in the code you provided the function is named Sum, while the error refers to sys_Sum; I hope that's just a typo and the function really is named sys_Sum.)

frama-c malloc Neon-20140301 fatal error

Is it possible to detect memory leaks or double free with Frama-c?
I have tried to test this example, but
#include <string.h>
#include <stdlib.h>
#define FRAMA_C_MALLOC_STACK
#include "/usr/share/frama-c/libc/fc_runtime.c"
int main()
{
int *x = malloc(sizeof(int));
free(x);
free(x);
return 0;
}
I get:
I am using version Neon-20140301 with the libc copied from Fluorine-20130601. (By the way, why were fc_runtime.c and the other *.c files removed from the Neon release?)
command:
frama-c-gui -cpp-command "gcc -C -E -I/usr/share/frama-c/libc/ -nostdinc" -slevel 1000 -val -val-warn-copy-indeterminate @all main.c
Using the other defines (FRAMA_C_MALLOC_XXXX) works, but does not detect any bugs.
Update: another example:
#include <string.h>
#include <stdlib.h>
#define FRAMA_C_MALLOC_STACK
#include "/usr/share/frama-c/libc/fc_runtime.c"
int main()
{
int *x = malloc(sizeof(int));
x[2] = 5;
return 0;
}

C program character return incorrect value

This code returns the correct pointer address to the main function, but the output is faulty:
#include <stdio.h>
#include <stdlib.h>
char* ret();
int main()
{
char *na;
na = ret();
printf("%s \n",na);
return 0;
}
char* ret()
{
char c[15],*p;
scanf("%s",c);
p = c;
return p;
}
This gives wrong output in Code::Blocks with the GNU GCC compiler.
When I give the input "goods", the output produced is goo£Í"

Error declaration prototype with fancy vector

I have a problem, but I don't know what it is. I receive an error when I compile my code (some gnuplot is involved).
#include <iostream>
#include <fstream>
#include <vector>
#include <map>
#include <string>
#include <math.h>
#include "gnuplot_i.hpp"
using namespace std;
typedef struct DATA{
char Label[50]; //title
vector<double> y,SD; //y data point SD sigma
}DATA;
typedef map<int, double> Episode;
typedef map<int, Episode> Stat_run;
double GetAvg(double *Array, int Count, double *stddev);
void wait_for_key();
void plotMyLines(DATA *Data, vector< std::map<int, map<int, double> > > Points, int printsteps, double Y1, double Y2, int episode, int run);
void PlotLines(const char *Outfile, vector<double> x, DATA *Data, int Lines, const string &xlabel, const string &ylabel, double Y1, double Y2);
int main()
{
vector<Stat_run> Points;
Stat_run exp1; Episode eps;
Stat_run exp2; Episode eps2;
}
I removed most of my code. The goal is to format some results and send them to my plotting functions. The error seems simple, but after two hours of testing I can't find the problem. Error:
error: expected ‘,’ or ‘...’ before ‘-’ token
I get this error on the prototypes of plotMyLines and PlotLines. Any hints appreciated!
I'm pretty sure this is not the complete minimal code showing the problem, and that there is a preprocessor mess-up somewhere: I can compile it without any problem, and there is no '-' token anywhere in this snippet of code.
To investigate a preprocessor issue, look at the output of preprocessing, e.g.
gcc -E -o test.cpp.ii .... (etc.)
You should be able to see exactly what the compiler sees at those lines (scroll all the way down to recognize your own code).
The following compiles like a charm on g++
#include <map>
#include <vector>
#include <string>
using namespace std;
typedef struct DATA{
char Label[50]; //title
vector<double> y,SD; //y data point SD sigma
}DATA;
typedef map<int, double> Episode;
typedef map<int, Episode> Stat_run;
double GetAvg(double *Array, int Count, double *stddev);
void wait_for_key();
void plotMyLines(DATA *Data, vector< std::map<int, map<int, double> > > Points, int printsteps, double Y1, double Y2, int episode, int run);
void PlotLines(const char *Outfile, vector<double> x, DATA *Data, int Lines, const string &xlabel, const string &ylabel, double Y1, double Y2);
int main()
{
vector<Stat_run> Points;
Stat_run exp1; Episode eps;
Stat_run exp2; Episode eps2;
}

How to convert Unicode to a printable string in a Qt stream

I'm writing a stream to a file and stdout, but I'm getting some kind of encoding like this:
\u05ea\u05e7\u05dc\u05d9\u05d8
\u05e9\u05e1\u05d9\u05de\u05dc
\u05e9\u05d9\u05e0\u05d5\u05d9
\u05d1\u05e1\u05d2\u05e0\u05d5\u05df
\u05dc\u05d3\u05e2\u05ea\u05d9
\u05d0\u05dd \u05d0\u05e0\u05d9
\u05d6\u05d5\u05db\u05e8
\u05e0\u05db\u05d5\u05df
How can I convert this to a printable string?
I can't figure out how you are printing the string, but that is just Unicode:
#include <QString>
#include <QFile>
#include <QDebug>
int main(int argc, char **argv)
{
QString s = "\u05ea\u05e7\u05dc\u05d9\u05d8 \u05e9\u05e1\u05d9\u05de\u05dc \u05e9\u05d9\u05e0\u05d5\u05d9 \u05d1\u05e1\u05d2\u05e0\u05d5\u05df \u05dc\u05d3\u05e2\u05ea\u05d9 \u05d0\u05dd \u05d0\u05e0\u05d9 \u05d6\u05d5\u05db\u05e8 \u05e0\u05db\u05d5\u05df";
QFile file1("1.txt");
if (!file1.open(QIODevice::WriteOnly | QIODevice::Text))
return 1;
QTextStream out(&file1);
out << s << "\n";
qDebug() << s;
return 0;
}
If I compile and run it
g++ -lQtCore -I /usr/include/QtCore test.cpp
./a.out
I can see the printable characters both in the console debug output and in the file:
"תקליט שסימל שינוי בסגנון לדעתי אם אני זוכר נכון"
So you are probably doing something wrong or looking in the wrong place. Can you paste your code so we can help you better?
