## I would like to have this decoded

Hex (512 bytes): 3c 3f 78 6d 6c 20 76 65 72 73 69 6f 6e 3d 22 31 2e 30 22 20 65 6e 63 6f 64 69 6e 67 3d 22 75 74 66 2d 38 22 3f 3e 0a 3c 21 44 4f 43 54 59 50 45 20 73 76 67 20 50 55 42 4c 49 43 20 22 2d 2f 2f 57 33 43 2f 2f 44 54 44 20 53 56 47 20 31 2e 31 2f 2f 45 4e 22 20 22 68 74 74 70 3a 2f 2f 77 77 77 2e 77 33 2e 6f 72 67 2f 47 72 61 70 68 69 63 73 2f 53 56 47 2f 31 2e 31 2f 44 54 44 2f 73 76 …
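The bytes are plain UTF-8 text: an XML declaration followed by an SVG DOCTYPE. A short sketch of decoding the dump (shown here with just the first bytes from above):

```python
# Decode the space-separated hex dump back into text; the bytes are plain UTF-8.
hex_dump = "3c 3f 78 6d 6c 20 76 65 72 73 69 6f 6e 3d 22 31 2e 30 22"
decoded = bytes.fromhex(hex_dump.replace(" ", "")).decode("utf-8")
print(decoded)  # <?xml version="1.0"
```

Feeding the full 512 bytes through the same two lines yields the start of an SVG file.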

## android – Error: Decoded digest length 48 does not match expected length for SHA-256 of 32

I'm adding a network_security_config.xml file in which I set the SHA-256 pin of the host I want to connect to, but I get a length error.
Could someone help me?

This is my XML:

<?xml version="1.0" encoding="utf-8"?>
<network-security-config>
    <base-config cleartextTrafficPermitted="true">
        <trust-anchors>
            <certificates src="https://es.stackoverflow.com/@raw/server_cert" />
        </trust-anchors>
    </base-config>
    <domain-config>
        <domain includeSubdomains="true">https:ejemplo.com</domain>
        <trust-anchors>
            <certificates src="https://es.stackoverflow.com/@raw/server_cert"/>
        </trust-anchors>
        <pin-set expiration="2021-12-31">
            <pin digest="SHA-256">9c8kf8k7ee2bkb9dk81d2e5fofc2ke0c3c88o2k362fed23ko671f4732k113o3o=</pin>
        </pin-set>
    </domain-config>
</network-security-config>


But when I build, it throws the following error:

C:\User\Android\platforms\android\app\src\main\res\xml\network_security_config.xml:14: Error: Decoded digest length 48 does not match expected length for SHA-256 of 32 [NetworkSecurityConfig]
9c8kf8k7ee2bkb9dk81d2e5fofc2ke0c3c88o2k362fed23ko671f4732k113o3o
^
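For context on the error: a SHA-256 digest is always 32 bytes, which Base64-encodes to 44 characters, while a 64-character Base64 string decodes to 48 bytes, matching the reported length. A minimal sketch of the expected pin shape (the input bytes below are a placeholder; in practice the pin is the Base64 of the SHA-256 of the certificate's SubjectPublicKeyInfo):

```python
import base64
import hashlib

# Placeholder bytes; in practice this would be the DER-encoded
# SubjectPublicKeyInfo of the certificate being pinned.
spki = b"placeholder SubjectPublicKeyInfo bytes"

digest = hashlib.sha256(spki).digest()        # always 32 bytes for SHA-256
pin = base64.b64encode(digest).decode("ascii")  # always 44 characters

print(len(digest), len(pin))  # 32 44
```

A 64-character value like the one in the question decodes to 48 bytes, which is why the check fails.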

## mining theory – Why does decoded block data not match coinbase transaction input?

I am looking at the specific block at height 680175.

Via bitcoin-cli getblock 00000000000000000004dbd66fa71fdcd62658bf8c8e2e153521257ad5858c71 0 I obtained the serialized, hex-encoded block data. According to the section Serialized Blocks the txn_count starts at Byte #81. Following the description of CompactSize Unsigned Integers I get:

In [115]: int.from_bytes(byte_arr[80:81], "little", signed=False)
Out[115]: 253

In [116]: tnx_count = byte_arr[81:83]
     ...: tnx_count = int.from_bytes(tnx_count, "little", signed=False)

In [117]: tnx_count
Out[117]: 1702


There are indeed 1702 transactions which can be verified via ./bitcoin-cli getblock 00000000000000000004dbd66fa71fdcd62658bf8c8e2e153521257ad5858c71 2 in the attribute nTx.
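The CompactSize decoding performed above can be sketched as a small helper (simplified; the 0xfd/0xfe/0xff prefixes select a 2-, 4-, or 8-byte little-endian value):

```python
def read_compact_size(buf, offset):
    """Read a Bitcoin CompactSize integer; return (value, bytes consumed)."""
    first = buf[offset]
    if first < 0xfd:
        return first, 1
    width = {0xfd: 2, 0xfe: 4, 0xff: 8}[first]
    value = int.from_bytes(buf[offset + 1:offset + 1 + width], "little")
    return value, 1 + width

# 0xfd prefix means the next 2 bytes hold the count: 0x06a6 = 1702
print(read_compact_size(bytes([0xfd, 0xa6, 0x06]), 0))  # (1702, 3)
```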

Now I know that the rest of the block data is the transactions’ part. According to the section Raw Transaction Format the first 4 bytes are the version:

In [118]: tnx_part = byte_arr[83:]

In [119]: version = tnx_part[:4]

In [120]: int.from_bytes(version, "little", signed=False)
Out[120]: 1


This seems to be correct, too. So the next bytes should hold the tx_in count, but there I get 0, which is wrong. And the input of the coinbase transaction does not start with a 32-byte null:

In [121]: int.from_bytes(data[4:5], "little", signed=False)
Out[121]: 0

In [122]: data[4:10]
Out[122]: bytearray(b'\x00\x01\x01\x00\x00\x00')


Am I missing something? The input of the coinbase transaction does not seem to match the description.

## python 2.7 – Python 2.7, No JSON object could be decoded

I'm trying to get data from a JSON endpoint through a PHP web proxy, "http://circuitec.com.br/errors.php", and I get the error ValueError: No JSON object could be decoded at this line: data = json.load(response)

This is an example of the code I'm trying to use that gives the error:

import json
import urllib2

def ejemplo():
    USER_AGENT = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3730.0 Safari/537.36"
    headers = {'User-Agent': USER_AGENT, 'Referer': 'http://ip-api.com'}
    request = urllib2.Request('http://circuitec.com.br/errors.php?q=http%3A%2F%2Fip-api.com%2Fjson%2F&hl=20', None, headers)
    response = urllib2.urlopen(request)
    data = json.load(response)  # this is the line that raises ValueError
    device_location = data['country']
    device_ip = data['query']

Could someone please help me fix the code? I'm a beginner.
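In Python 2.7, "No JSON object could be decoded" means json.load received something that isn't JSON, often an HTML error page returned by the proxy instead of the expected payload. A small sketch (Python 3 syntax, with a hypothetical response body) of inspecting the body before indexing into it:

```python
import json

# Hypothetical response body; in the real code this would be response.read().
body = '{"country": "Brazil", "query": "1.2.3.4"}'

try:
    data = json.loads(body)
except ValueError:
    # Print the raw body to see what the proxy actually returned.
    print("Not JSON:", body[:200])
else:
    print(data["country"], data["query"])
```

If the printed body turns out to be HTML or empty, the problem is in the proxy's response, not in the JSON parsing.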

## computer vision – What is the function of the Program-Counter and how is it affected after an instruction is decoded?


## encoding – Decoding UUID4 (version 4 variant 1) given a minimum number of decoded pairs

Given a bunch of pairs like these:

1. My – UUID number 1
2. secret – UUID number 2
3. is – UUID number 3

Decode the next word given only the UUID of the message.

I spent yesterday searching the web for hints to help me understand whether, and how, it is possible to decode some future UUID encoded with the same algorithm as past UUIDs. In the case of UUID4 variant 1, which is built from a pseudo-randomly generated number, my understanding is that I should create an algorithm to compare the different pairs in order to figure out which random number was used. However, I would find useful some examples and/or literature that can take me down the right path, because I feel lost in the rabbit hole.
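For reference, a version-4, variant-1 UUID fixes only six bits (the version and variant fields); the remaining 122 bits come from the generator's random source. Those fixed fields can be inspected with Python's uuid module:

```python
import uuid

u = uuid.uuid4()
# For uuid4() the version nibble is always 4 and the variant is RFC 4122.
print(u.version)                    # 4
print(u.variant == uuid.RFC_4122)   # True
```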

## raw data – How is the whitepaper decoded from the blockchain (Tx with ~1000x m of n multisig outputs)

This is a fun little puzzle on the blockchain, basically. First, you need to know a little about pdf’s and how they’re structured, which you can find here.

Second, you’ll note from section 3.4.1 that all pdf’s start with this string:

%PDF-


In hex, that is 255044462d. And indeed that is in the very first output in the very first bare multisig pubkey:

<e4cf0200><067daf13>**255044462d**312e340a25c3a4c3bcc3b6c39f0a322030206f626a0a3c3c2f4c656e6774682033203020522f46696c7465722f466c6174654465

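The claimed hex prefix is easy to verify:

```python
# "%PDF-" encoded as ASCII hex is exactly 255044462d.
print(b"%PDF-".hex())  # 255044462d
```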

I haven't figured out what the first 8 bytes are for (edit: e4cf0200067daf13 = 2x 4-byte little-endian "checksums", see @WizardOfOzzie's comment below), but the rest of the bare multisig keys (everything in between 1 and 3 OP_CHECKMULTISIG in each output; note the last one is a 1 of 1, so it's 1 OP_CHECKMULTISIG) are pieces of data for the pdf, and they are in order. If you put all the hex digits of the bare multisig keys into a single file (no whitespace) called "fromblockchain.hex", you can run this very simple program to extract the pdf:

contents = open('fromblockchain.hex').read()
data = contents[16:].decode('hex')  # Python 2: skip the 16 leading hex digits (8 bytes), then hex-decode
f = open("bitcoin.pdf", "wb")       # must be opened for writing in binary mode
f.write(data)
f.close()


This should create a bitcoin.pdf which is the actual satoshi whitepaper. I’ve tested this and indeed it is the whitepaper. Good to know it’s literally in the blockchain.

Alternatively, if you have bitcoind running on your machine, you can run this python script to grab the bitcoin whitepaper:

import subprocess

# raw = full hex of raw Tx using bitcoin-cli
raw = subprocess.check_output(["bitcoin-cli", "getrawtransaction", "54e48e5f5c656b26c3bca14a8c95aa583d07ebe84dde3b7dd4a78f4e4186e713"])

outputs = raw.split("0100000000000000")

pdf = ""
for output in outputs[1:-2]:
    # there are 3 65-byte parts in this that we need
    cur = 6
    pdf += output[cur:cur+130].decode('hex')
    cur += 132
    pdf += output[cur:cur+130].decode('hex')
    cur += 132
    pdf += output[cur:cur+130].decode('hex')

pdf += outputs[-2][6:-4].decode("hex")
f = open("bitcoin.pdf", "wb")
f.write(pdf[8:])
f.close()


## How can a be decoded as 61 and b as 62…? [closed]

I'm trying to overflow the stack of a C program, but I'm confused by the encoding.

I used OllyDbg to observe the registers. I entered "a" 28 times, which should place the last "a" into the EIP register because of the strcpy call used.

When the last "a" overflows into EIP, it reads as 61.

According to ASCII:

A is 65
B is 66


But the OS reads a as 61.

What is this encoding? I’m very new to encoding.
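For reference, 65 and 66 are the decimal ASCII codes of uppercase A and B; a debugger such as OllyDbg shows register contents in hexadecimal, and lowercase "a" is decimal 97, which is 0x61. A quick check:

```python
# ASCII: 'a' is 97 decimal, which a debugger displays as hex 61.
print(ord('a'), hex(ord('a')))  # 97 0x61
print(ord('A'), hex(ord('A')))  # 65 0x41
```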

## .htaccess – Can the Apache SetEnvIf Request_URI variable be decoded in Virtual Host and/or Server config context?

When the following SetEnvIf directive is in a virtual-host directory section or in an .htaccess file, the Request_URI variable is percent-decoded.
But when it is not in a directory section or .htaccess file (i.e. in virtual-host or server-config context), it is not decoded (it remains percent-encoded).

What controls/determines this difference? My guess is that it is a byproduct of having to traverse the file-system directory tree, because the file-system path is not percent-encoded.

Is this configurable such that the Request_URI Apache SetEnvIf variable is percent decoded in the virtual host and/or server config context also?

SetEnvIf Request_URI "^.*access_logger/counters/(.*).gif$" Page_Name_File_Name=$1 log_file log_sql_db validate_cache

Apache/2.4.6 (CentOS)
Apr 2 2020 13:13:23

## tls – Where are field names of decoded human readable X.509 certificates specified?

The ASN.1 module for X.509 certificates as specified in RFC 5912 – Section 14 is as follows:

TBSCertificate  ::=  SEQUENCE  {
    version         [0]  Version DEFAULT v1,
    serialNumber         CertificateSerialNumber,
    signature            AlgorithmIdentifier{SIGNATURE-ALGORITHM,
                             {SignatureAlgorithms}},
    issuer               Name,
    validity             Validity,
    subject              Name,
    subjectPublicKeyInfo SubjectPublicKeyInfo,
    ... ,
    [[2:               -- If present, version MUST be v2
    issuerUniqueID  [1]  IMPLICIT UniqueIdentifier OPTIONAL,
    subjectUniqueID [2]  IMPLICIT UniqueIdentifier OPTIONAL
    ]],
    [[3:               -- If present, version MUST be v3 --
    extensions      [3]  Extensions{{CertExtensions}} OPTIONAL
    ]], ... }



The field names are the same in RFC 5280.

The decoded example certificate on the X.509 Wikipedia page however has completely different field names:

Certificate:
    Data:
        Version: 3 (0x2)
        Serial Number:
            10:e6:fc:62:b7:41:8a:d5:00:5e:45:b6
        Signature Algorithm: sha256WithRSAEncryption
        Issuer: C=BE, O=GlobalSign nv-sa, CN=GlobalSign Organization Validation CA - SHA256 - G2
        Validity
            Not Before: Nov 21 08:00:00 2016 GMT
            Not After : Nov 22 07:59:59 2017 GMT
        Subject: C=US, ST=California, L=San Francisco, O=Wikimedia Foundation, Inc., CN=*.wikipedia.org
        Subject Public Key Info:
            Public Key Algorithm: id-ecPublicKey
                Public-Key: (256 bit)
                pub:
                    00:c9:22:69:31:8a:d6:6c:ea:da:c3:7f:2c:ac:a5:
                    af:c0:02:ea:81:cb:65:b9:fd:0c:6d:46:5b:c9:1e:
                    9d:3b:ef
                ASN1 OID: prime256v1
                NIST CURVE: P-256
        X509v3 extensions:
            X509v3 Key Usage: critical
                Digital Signature, Key Agreement

        ...


Signature Algorithm instead of algorithm, X509v3 extensions instead of just extensions.

Since the certificate has version 3, I would assume it doesn't have anything to do with the version…

Of course I searched for various field names like X509v3 Key Usage or X509v3 CRL Distribution Points but couldn't find any reference.