c++ wcout std::map value - dictionary

I have a std::map called 'prompts' which is declared like this:
std::map<const int, wstring, std::less<int>, std::allocator<std::pair<const int, std::wstring> >> prompts;
and it stores int 'key' and wstring 'value' pairs. If I do this:
wcout << prompts[interpreter->get_state()];
The compiler (vc10) complains
error C2679: binary '<<' : no operator found which takes a right-hand operand of type 'std::basic_string<_Elem,_Traits,_Ax>' (or there is no acceptable conversion)
What do I have to do to get the wstring value returned from the map to print with wcout? Some sort of cast? Or...?

In the first line, you are missing a std:: in front of wstring:
std::map<const int, std::wstring, std::less<int>, std::allocator<std::pair<const int, std::wstring> >> prompts;
You should also write std::wcout instead of wcout (or add the appropriate using declaration).
I just tried this code and it compiles.
#include <map>
#include <iostream>
#include <string>
int main()
{
    std::map<const int, std::wstring, std::less<int>, std::allocator<std::pair<const int, std::wstring> >> prompts;
    std::wcout << prompts[1];
    return 0;
}
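If you also insert a value first, the lookup and the print work end to end (a small sketch; the key 1 and the text are just placeholders):
prompts[1] = L"Hello";
std::wcout << prompts[1]; // prints Hello
Note that prompts[key] default-constructs an empty wstring when the key is not present, so printing a missing key simply prints nothing.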

Related

yylval undefined with lex and yacc

I was trying a simple program to create an abstract syntax tree using lex and yacc.
My yacc_file.y is
%{
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
typedef struct node
{
    struct node *left;
    struct node *right;
    char *token;
} node;
node *mknode(node *left, node *right, char *token);
void printtree(node *tree);
#define YYSTYPE struct node *
%}
%start lines
%token NUMBER
%token PLUS MINUS TIMES
%token LEFT_PARENTHESIS RIGHT_PARENTHESIS
%token END
%left PLUS MINUS
%left TIMES
%%
lines: /* empty */
| lines line /* do nothing */
line: exp END { printtree($1); printf("\n");}
;
exp : term {$$ = $1;}
| exp PLUS term {$$ = mknode($1, $3, "+");}
| exp MINUS term {$$ = mknode($1, $3, "-");}
;
term : factor {$$ = $1;}
| term TIMES factor {$$ = mknode($1, $3, "*");}
;
factor : NUMBER {$$ = mknode(0,0,(char *)yylval);}
| LEFT_PARENTHESIS exp RIGHT_PARENTHESIS {$$ = $2;}
;
%%
int main (void) {return yyparse ( );}
node *mknode(node *left, node *right, char *token)
{
    /* malloc the node */
    node *newnode = (node *)malloc(sizeof(node));
    char *newstr = (char *)malloc(strlen(token)+1);
    strcpy(newstr, token);
    newnode->left = left;
    newnode->right = right;
    newnode->token = newstr;
    return(newnode);
}
void printtree(node *tree)
{
    int i;
    if (tree->left || tree->right)
        printf("(");
    printf(" %s ", tree->token);
    if (tree->left)
        printtree(tree->left);
    if (tree->right)
        printtree(tree->right);
    if (tree->left || tree->right)
        printf(")");
}
int yyerror (char *s) {fprintf (stderr, "%s\n", s);}
My lex_file.l file is
%{
#include "yacc_file.tab.h"
%}
%%
[0-9]+ {yylval = (int)yytext; return NUMBER;}
/* cast pointer to int for compiler warning */
[ \t\n] ;
"+" return(PLUS);
"-" return(MINUS);
"*" return(TIMES);
"(" return(LEFT_PARENTHESIS);
")" return(RIGHT_PARENTHESIS);
";" return(END);
%%
int yywrap (void) {return 1;}
To run, I have done the following
yacc -d yacc_file.y
lex lex_file.l
cc y.tab.c lex.yy.c -o a.exe
I got the following error
lex_file.l: In function 'yylex':
lex_file.l:10:2: error: 'yylval' undeclared (first used in this function)
[0-9]+ {yylval=(int)yytext; return NUMBER;}
I have searched on google and %union seems to solve the problem. But I am not sure how to use it.
The command
yacc -d yacc_file.y
produces a header file called y.tab.h and a C file called y.tab.c. That's the yacc-compatible default naming, and it does not agree with your flex file, which is expecting the header to be called yacc_file.tab.h.
You could just change the #include statement in your flex file, but that wouldn't be compatible with the build system at your college. So I suggest you change to the command bison -d yacc_file.y instead of your yacc command. That will produce a header file called yacc_file.tab.h and a C file called yacc_file.tab.c. (Of course, you will then have to change the cc command to compile yacc_file.tab.c instead of y.tab.c.)
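With the file names from your question, the whole build would then look something like this (flex in place of lex behaves the same way here):
bison -d yacc_file.y
flex lex_file.l
cc yacc_file.tab.c lex.yy.c -o a.exe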
Presumably there is some incorrect yacc_file.tab.h on your machine, which doesn't include a declaration of yylval. Hence the compilation error.
To avoid confusing yourself further, when you fix your build procedure I'd recommend deleting all the intermediate files -- y.tab.h and y.tab.c as well as yacc_file.tab.c and yacc_file.tab.h, and lex.yy.c. Then you can do a clean build without having to worry about picking up some outdated intermediate file.
Also, in yacc_file.y, you #define YYSTYPE as struct node *. That's fine, but the #define will not be copied into the generated header file; in the header file, YYSTYPE will be #defined as int if there is no other #define before the header file is #included.
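One way to keep the two files in agreement (a sketch only; ast.h is a hypothetical name) is to move the node declaration and the YYSTYPE define into a small shared header, include it from the %{ %} block of yacc_file.y in place of the in-file definitions, and include it before the generated header in the lexer:
/* ast.h (hypothetical) -- shared by yacc_file.y and lex_file.l */
typedef struct node
{
    struct node *left;
    struct node *right;
    char *token;
} node;
#define YYSTYPE node *
and in lex_file.l:
%{
#include "ast.h"            /* defines YYSTYPE before the generated header is read */
#include "yacc_file.tab.h"
%}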
Moreover, in lex_file.l you use yylval as though it were an int (yylval = (int)yytext;), but that statement does not do what you think it does. It reinterprets the address of yytext as an integer, which is legal but meaningless. What you wanted, I think, is to convert the string in yytext to an integer; for that you need strtol or some similar function from the standard C library.
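If yylval really were an int, the rule would look roughly like this (a sketch only; in your grammar yylval is a node pointer, so you would build a node from the converted value instead):
[0-9]+   { yylval = strtol(yytext, NULL, 10); return NUMBER; }   /* needs <stdlib.h> */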
Regardless, it is vital that the scanner and the parser agree on the type of yylval. Otherwise, things will go desperately wrong.
As you mention, it is possible to use a %union declaration to declare YYSTYPE as a union type. You should make sure you understand C union types, and also read the bison manual's section on defining language semantics.
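A minimal sketch of what that could look like for your grammar (the member name np is just an example):
%union {
    struct node *np;
}
%token <np> NUMBER
%type  <np> exp term factor
The scanner then assigns to the named member (yylval.np = ...) rather than to yylval directly, and bison checks that every $$ and $n matches the declared type.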

C++: OpenSSL, aes cfb encryption [duplicate]

I tried to implement a "very" simple encryption/decryption example. I need it for a project where I would like to encrypt some user information. I can't encrypt the whole database but only some fields in a table.
The database and most of the rest of the project works, except the encryption:
Here is a simplified version of it:
#include <openssl/aes.h>
#include <openssl/evp.h>
#include <iostream>
#include <string.h>
using namespace std;
int main()
{
    /* ckey and ivec are the two 128-bit keys necessary to
       encrypt and decrypt your data. Note that ckey can be
       192 or 256 bits as well
    */
    unsigned char ckey[] = "helloworldkey";
    unsigned char ivec[] = "goodbyworldkey";
    int bytes_read;
    unsigned char indata[AES_BLOCK_SIZE];
    unsigned char outdata[AES_BLOCK_SIZE];
    unsigned char decryptdata[AES_BLOCK_SIZE];
    /* data structure that contains the key itself */
    AES_KEY keyEn;
    /* set the encryption key */
    AES_set_encrypt_key(ckey, 128, &keyEn);
    /* set where on the 128 bit encrypted block to begin encryption */
    int num = 0;
    strcpy((char*)indata, "Hello World");
    bytes_read = sizeof(indata);
    AES_cfb128_encrypt(indata, outdata, bytes_read, &keyEn, ivec, &num, AES_ENCRYPT);
    cout << "original data:\t" << indata << endl;
    cout << "encrypted data:\t" << outdata << endl;
    AES_cfb128_encrypt(outdata, decryptdata, bytes_read, &keyEn, ivec, &num, AES_DECRYPT);
    cout << "input data was:\t" << decryptdata << endl;
    return 0;
}
But the "decrypted" output is just random characters, though they are the same on every execution of the code; outdata changes with every execution...
I tried to debug and search for a solution, but I couldn't find any solution for my problem.
Now my question, what is going wrong here? Or do I completely misunderstand the provided functions?
The problem is that AES_cfb128_encrypt modifies the ivec (it has to, in order to allow chaining). The solution is to create a copy of the ivec and initialize it before each call to AES_cfb128_encrypt, as follows:
const char ivecstr[AES_BLOCK_SIZE] = "goodbyworldkey\0";
unsigned char ivec[AES_BLOCK_SIZE];
memcpy( ivec , ivecstr, AES_BLOCK_SIZE);
Then repeat the memcpy before your second call to AES_cfb128_encrypt.
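Put together, the two calls would look roughly like this (a sketch reusing the variables from your program):
unsigned char ivec[AES_BLOCK_SIZE];
memcpy(ivec, ivecstr, AES_BLOCK_SIZE);    /* fresh IV for encryption */
AES_cfb128_encrypt(indata, outdata, bytes_read, &keyEn, ivec, &num, AES_ENCRYPT);
num = 0;                                  /* already 0 after a full block, but resetting is safer */
memcpy(ivec, ivecstr, AES_BLOCK_SIZE);    /* restore the IV before decrypting */
AES_cfb128_encrypt(outdata, decryptdata, bytes_read, &keyEn, ivec, &num, AES_DECRYPT);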
Note 1: Your initial vector was a byte too short, so I put an explicit additional \0 at the end of it. You should make sure all of your strings are of the correct length when copying or passing them.
Note 2: Any code which uses encryption should REALLY avoid using strcpy or any other copy of unchecked length. It's a hazard.

How to shorten std::vector?

Instead of doing this for calloc:
TCHAR *sText = (TCHAR *) calloc(1024, sizeof(TCHAR));
I have this at the top of my C++ file:
#define tcalloc(nCharacters) (TCHAR*)calloc(nCharacters,sizeof(TCHAR))
so I can more easily write this:
TCHAR *sText = tcalloc(1024);
Now, how do I do a shorthand for my std::vector statement? This is my code:
std::vector<TCHAR> sText(1024, 0);
maybe
typedef std::vector<TCHAR> tcVec;
#define init_1k_charVector tcVec(1024,0)
int main(int, char**)
{
    tcVec sText(1024, 0);
    tcVec sText2 = init_1k_charVector;
    ...
}
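If you would rather avoid the macro, a small inline function (make_text is just an illustrative name, assuming the same headers as your code) gives the same shorthand with ordinary C++ scoping:
inline std::vector<TCHAR> make_text(std::size_t n = 1024)
{
    return std::vector<TCHAR>(n, 0);
}
// usage:
std::vector<TCHAR> sText = make_text();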

Failing to convert raw binary/hex to int interpretation

I'm trying to convert raw hex/binary data to different file types.
#include <QByteArray>
#include <QDebug>
int main(int argc, char *argv[])
{
    QByteArray package;
    package.append(QByteArray::fromHex("a1"));
    // "a1" is what is written to the memory, not the string representation of "a1"
    qDebug() << package.toHex(); // "a1"
    qDebug() << package;         // "�"
    qDebug() << package.toInt(); // 0
}
Why is the int representation 0 and not 161?
toInt has a totally different purpose: it parses a string representation of an integer. If you want an integer holding the value of the first byte of the array, use package[0], which has type char. I don't remember how qDebug() represents the char type, but if you have any problems with it, just static_cast it to unsigned int.
QByteArray::toInt expects that QByteArray contains a string of characters (in ASCII probably), not the binary representation of the number.
If you want to convert binary representation to integer you can use reinterpret_cast:
int i = *reinterpret_cast<quint8*>(package.constData());
Or better use qFromBigEndian/qFromLittleEndian:
int i = qFromLittleEndian<quint8>((const uchar*)package.constData());
In both cases you must know exactly in what format the number is stored and use proper type and endianness.
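A minimal, self-contained sketch of both approaches (assuming Qt with the QtEndian header available):
#include <QByteArray>
#include <QtEndian>
#include <QDebug>
int main()
{
    QByteArray package = QByteArray::fromHex("a1");
    // first byte read as an unsigned value
    qDebug() << static_cast<unsigned int>(static_cast<quint8>(package.at(0)));  // 161
    // the same value via the endian helpers
    qDebug() << static_cast<unsigned int>(
        qFromLittleEndian<quint8>(reinterpret_cast<const uchar*>(package.constData())));  // 161
    return 0;
}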

printf byte to hex string strange output

The following simple code produces strange output:
#include <stdio.h>
#include <string.h>
#include "/tmp/sha.h"
#define DIGEST 64
//taken from coreutils 8.5 - produces 64-byte sha digest and puts it into resblock
extern int sha512_stream(FILE *stream, void *resblock);
int main(int argc, char** argv) {
    char sha[DIGEST];
    memset(sha, 0, DIGEST);
    FILE *stream;
    stream = fopen("/bin/less", "r");
    sha512_stream(stream, (void *) sha);
    fclose(stream);
    char buf[2] = {10, 32};
    printf("%02x\n", sha[0]);
    printf("%02x\n", buf[0]);
    return 0;
}
Gives the output:
ffffffa9
0a
The first byte of sha is A9, but where are the padding F's coming from?
On Ubuntu Linux 10.10 with gcc 4.4.5.
(char) defaults to (signed char) on Linux x86, and because printf() uses stdarg, the (signed char) is implicitly promoted to (int), resulting in sign extension. You'll need to declare the buffer (unsigned char) to get the expected behavior. (There is no way to pass type information through stdarg, so default promotions are performed on the arguments.)
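A one-line illustration of the fix, using the sha buffer from the question (either cast at the call site as shown, or declare the buffer unsigned char in the first place):
printf("%02x\n", (unsigned char)sha[0]);   /* promoted to int without sign extension; prints a9 */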
