Convert a C++ function to C#

I am trying to port the following C++ function to C#:

QString Engine::FDigest(const QString & input)
{
    if(input.size() != 32) return "";

    int idx[] = {0xe, 0x3, 0x6, 0x8, 0x2},
        mul[] = {2, 2, 5, 4, 3},
        add[] = {0x0, 0xd, 0x10, 0xb, 0x5},
        a, m, i, t, v;

    QString b;
    char tmp[2] = { 0, 0 };

    for(int j = 0; j <= 4; j++)
    {
        a = add[j];
        m = mul[j];
        i = idx[j];

        tmp[0] = input[i].toAscii();
        t = a + (int)(strtol(tmp, NULL, 16));
        v = (int)(strtol(input.mid(t, 2).toLocal8Bit(), NULL, 16));

        snprintf(tmp, 2, "%x", (v * m) % 0x10);
        b += tmp;
    }

    return b;
}

Some of this code is easy to port however I'm having problems with this part:

tmp[0] = input[i].toAscii();
t = a + (int)(strtol(tmp, NULL, 16));
v = (int)(strtol(input.mid(t, 2).toLocal8Bit(), NULL, 16));

snprintf(tmp, 2, "%x", (v * m) % 0x10);

I have found that (int)strtol(tmp, NULL, 16) corresponds to int.Parse(tmp, NumberStyles.HexNumber) in C# and that snprintf corresponds to String.Format, however I'm not sure about the rest of it.

How can I port this fragment to C#?

asked Nov 8 '11 at 09:11

Have updated my answer with potentially useful info -

I'm assuming QString is a Qt class? I added the tag. -

2 Answers

Edit: I have a suspicion that your code actually does an MD5 digest of the input data. See below for a snippet based on that assumption.

Translation steps

A few hints that should work well¹

Q: tmp[0] = input[i].toAscii();

byte[] ascii = Encoding.ASCII.GetBytes(input);
tmp[0] = ascii[i];

Q: t = a + (int)(strtol(tmp, NULL, 16));

t = a + int.Parse(string.Format("{0}{1}", tmp[0], tmp[1]),
                  System.Globalization.NumberStyles.HexNumber);

Q: v = (int)(strtol(input.mid(t, 2).toLocal8Bit(), NULL, 16));

No clue about toLocal8Bit, would need to read the Qt documentation...

Q: snprintf(tmp, 2, "%x", (v * m) % 0x10);

    string tmptext = ((v * m) % 16).ToString("x2");
    tmp[0] = tmptext[0];
    tmp[1] = tmptext[1];

What if ... it's just MD5?

You could try this directly to see whether it achieves what you need:

using System;
using System.Security.Cryptography;
using System.Text;

public string FDigest(string input)
{
    MD5 md5 = MD5.Create();
    byte[] ascii = Encoding.ASCII.GetBytes(input);
    byte[] hash  = md5.ComputeHash(ascii);

    // Convert the byte array to a hexadecimal string
    StringBuilder sb = new StringBuilder();
    for (int i = 0; i < hash.Length; i++)
        sb.Append(hash[i].ToString("X2")); // "x2" for lowercase
    return sb.ToString();
}

¹ explicitly not optimized, intended as quick hints; optimize as necessary
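If you want to sanity-check the MD5 hypothesis, a minimal standalone driver (hypothetical, not from the thread; the `Md5Demo` class and `Md5Hex` helper names are my own) might look like this:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class Md5Demo
{
    // Hex-encode the MD5 of an ASCII string, mirroring the snippet above.
    static string Md5Hex(string input)
    {
        using (MD5 md5 = MD5.Create())
        {
            byte[] hash = md5.ComputeHash(Encoding.ASCII.GetBytes(input));
            StringBuilder sb = new StringBuilder();
            foreach (byte b in hash)
                sb.Append(b.ToString("x2")); // lowercase hex
            return sb.ToString();
        }
    }

    static void Main()
    {
        // Well-known MD5 test value
        Console.WriteLine(Md5Hex("hello")); // 5d41402abc4a8e37b4b5fbae2bb66792
    }
}
```

Comparing its output against what the original C++ code produces for the same input would settle whether FDigest is just MD5.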

answered Nov 8 '11 at 11:18

I added an MD5 implementation using System.Security since I have a suspicion (after googling the original code) that it might be MD5. Note: this is nothing more than a suspicion (I don't readily recognize the code for hash functions) - sehe

For sure it's not MD5, since I'm computing MD5 before this function and the input of this function is an MD5 hash, but thanks for the hints. - Cfaniak

Can we get rid of the tiny text please? - Security Hound

A few more hints:

tmp is a two-byte buffer and you only ever write to the first byte, leaving a trailing nul. So tmp always holds a string of exactly one character, and you're processing the hex number one character at a time. So I think

tmp[0] = input[i].toAscii();
t = a + (int)(strtol(tmp, NULL, 16));

this is roughly int t = a + Convert.ToInt32(input.Substring(i, 1), 16); - take one digit from input and add its hex value to a, which you've looked up from a table. (I'm assuming that the toAscii is simply to map the QString character, which is already a hex digit, into ASCII for strtol, so if you already have a string of hex digits this is OK.)


v = (int)(strtol(input.mid(t, 2).toLocal8Bit(), NULL, 16));

this means look up two characters from input from offset t, i.e. input.Substring(t, 2), then convert these to a hex integer again. v = Convert.ToInt32(input.Substring(t, 2), 16); Now, as it happens, I think you'll only actually use the second digit here anyway since the calculation is (v * m) % 0x10, but hey. If again we're working with a QString of hex digits, then toLocal8Bit ought to be the same conversion as toAscii - I'm not clear why your code has two different functions here.

Finally convert these values to a single digit in tmp, then append that to b

snprintf(tmp, 2, "%x", (v * m) % 0x10);
b += tmp;

(2 is the length of the buffer, and since we need a trailing nul only 1 is ever written) i.e.

int digit = (v * m) % 0x10;
b += digit.ToString("x");

should do. I'd personally write the mod 16 as a logical and, & 0xf, since it's intended to strip the value down to a single digit.

Note also that in your code i is never set - I guess that's a loop or something you omitted for brevity?

So, in summary:

int t = a + Convert.ToInt32(input.Substring(i, 1), 16);
int v = Convert.ToInt32(input.Substring(t, 2), 16);
int nextDigit = (v * m) & 0xf;
b += nextDigit.ToString("x");
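Putting those pieces together, a complete port might look like the sketch below. This is my own assembly of the steps above, not code from the thread, and it hasn't been verified against the original C++; one assumption worth flagging is that Qt's mid(t, 2) silently clamps when t + 2 runs past the end of the string, whereas Substring throws, hence the explicit length guard.

```csharp
using System;

public static class Engine
{
    // Sketch of the full FDigest port, following the per-step hints above.
    public static string FDigest(string input)
    {
        if (input == null || input.Length != 32) return "";

        int[] idx = { 0xe, 0x3, 0x6, 0x8, 0x2 };
        int[] mul = { 2, 2, 5, 4, 3 };
        int[] add = { 0x0, 0xd, 0x10, 0xb, 0x5 };

        string b = "";
        for (int j = 0; j <= 4; j++)
        {
            // One hex digit from input plus the table offset
            int t = add[j] + Convert.ToInt32(input.Substring(idx[j], 1), 16);
            // Qt's mid() clamps at the end of the string; Substring throws,
            // so clamp the length explicitly (t can reach 31 when add[j] is 0x10)
            int len = Math.Min(2, input.Length - t);
            int v = Convert.ToInt32(input.Substring(t, len), 16);
            b += ((v * mul[j]) & 0xf).ToString("x"); // lowercase, like "%x"
        }
        return b;
    }
}
```

For example, feeding it the 32-character hex string "0123456789abcdef0123456789abcdef" yields "e2308"; checking a couple of known input/output pairs against the C++ original is the quickest way to confirm the port.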

answered Nov 8 '11 at 11:18
