refactor(http): Improves base64 encoding/decoding with feature detection #67002

SkyZeroZx wants to merge 4 commits into angular:main from
Conversation
alan-agius4 left a comment:
LGTM, thanks for this.
packages/common/http/src/util.ts
Outdated
export function toBase64(buffer: unknown): string {
  const bytes = new Uint8Array(buffer as ArrayBufferLike);
This looks off. Since toBase64 can get a Blob, would it throw?
I would very much prefer that this function is correctly typed and that the type assertion is made in the transfer cache.
I’ve just updated it. It’s weird, but it doesn’t throw an exception if we pass a Blob; it just sets the Uint8Array length to 0.
Is the Uint8Array empty if we pass it a Blob?
If we pass something like:

const myBlob = new Blob(['hello', 'world']);
const u8 = new Uint8Array(myBlob);
console.log(u8);

Use feature detection for `Uint8Array.prototype.toBase64` and `Uint8Array.fromBase64`, falling back to the existing implementation when native support is not available.
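A minimal sketch of that feature-detection approach (illustrative only, not the exact code in this PR; the `as any` casts assume TypeScript lib typings that don't yet declare the new methods):

```ts
// Prefer the native base64 helpers from the TC39 "Uint8Array to/from base64"
// proposal when the runtime provides them; otherwise fall back to btoa/atob.
export function toBase64(buffer: ArrayBuffer): string {
  const bytes = new Uint8Array(buffer);
  if (typeof (bytes as any).toBase64 === 'function') {
    return (bytes as any).toBase64();
  }
  // Fallback: build a binary string byte by byte and encode it with btoa.
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary);
}

export function fromBase64(value: string): Uint8Array {
  if (typeof (Uint8Array as any).fromBase64 === 'function') {
    return (Uint8Array as any).fromBase64(value);
  }
  // Fallback: decode with atob and copy the code units into a typed array.
  const binary = atob(value);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes;
}
```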
Force-pushed from bd809c2 to 3fe3545.
  [BODY]:
    req.responseType === 'arraybuffer' || req.responseType === 'blob'
-     ? toBase64(event.body)
+     ? toBase64(event.body as ArrayBufferLike)
Is this correct for a blob? If yes, we should at least have a test that shows this is okay.
I agree, I'll add a test for that and validate it.
Can you also add a comment? That type assertion doesn't make it obvious.
> Is this correct for a blob? If yes, we should at least have a test that shows this is okay.

I added a fix in this PR because parsing was needed. I tested it with Protocol Buffers, and I was confused by the response because it was always a Uint8Array (which is curious, since it works fine when setting a Blob). That was misleading on my part. This should fix the issue, and I tested both cases (Blob and Uint8Array) to ensure consistency.
Previously, Blob values were passed to `Uint8Array`; this silently produced an empty array (length = 0) without throwing an error, leading to empty cached data.
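A standalone illustration of that failure mode and the conversion used to avoid it (a sketch, not Angular source; assumes an environment with top-level `await`, e.g. an ES module):

```ts
const blob = new Blob(['hello world']);

// Pitfall: a Blob is not an ArrayBuffer, so the Uint8Array constructor
// treats it as an array-like with no length and silently yields 0 bytes.
const broken = new Uint8Array(blob as unknown as ArrayBufferLike);
console.log(broken.length); // 0

// Fix: read the Blob into an ArrayBuffer first, then wrap it.
const fixed = new Uint8Array(await blob.arrayBuffer());
console.log(fixed.length); // 11
```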
// Note: Blob is converted to ArrayBuffer because Uint8Array constructor
// doesn't accept Blob directly, which would result in an empty array.
// Type assertion is safe here: when responseType is 'blob', the body is guaranteed to be a Blob
return from((event.body as Blob).arrayBuffer()).pipe(
This is necessary because we need to convert asynchronously, and I couldn't think of a better way without adding many more RxJS operators.
The classic RxJS trap. There is no reason you need RxJS here; it just bloats the code and makes it harder to read.
I agree, although it might make sense to leave it as-is to avoid changing the test behavior. Additionally, this only runs on the server side.
  concatMap((event: HttpEvent<unknown>) => {
    // Only cache successful HTTP responses.
    if (event instanceof HttpResponse) {
-     transferState.set<TransferHttpResponse>(storeKey, {
-       [BODY]:
-         req.responseType === 'arraybuffer' || req.responseType === 'blob'
-           ? toBase64(event.body)
-           : event.body,
-       [HEADERS]: getFilteredHeaders(event.headers, headersToInclude),
-       [STATUS]: event.status,
-       [STATUS_TEXT]: event.statusText,
-       [REQ_URL]: requestUrl,
-       [RESPONSE_TYPE]: req.responseType,
-     });
+     const storeInCache = (body: unknown) => {
+       transferState.set<TransferHttpResponse>(storeKey, {
+         [BODY]: body,
+         [HEADERS]: getFilteredHeaders(event.headers, headersToInclude),
+         [STATUS]: event.status,
+         [STATUS_TEXT]: event.statusText,
+         [REQ_URL]: requestUrl,
+         [RESPONSE_TYPE]: req.responseType,
+       });
+     };
+
+     if (req.responseType === 'blob') {
+       // Convert Blob to ArrayBuffer asynchronously before caching.
+       // Note: Blob is converted to ArrayBuffer because Uint8Array constructor
+       // doesn't accept Blob directly, which would result in an empty array.
+       // Type assertion is safe here: when responseType is 'blob', the body is guaranteed to be a Blob
+       return from((event.body as Blob).arrayBuffer()).pipe(
+         tap((arrayBuffer) => storeInCache(toBase64(arrayBuffer))),
+         concatMap(() => of(event)),
+       );
+     }
+
+     // For arraybuffer, convert to base64; for other types (json, text), store as-is.
+     // Type assertion is safe here: when responseType is 'arraybuffer', the body is
+     // guaranteed to be an ArrayBuffer
+     const body =
+       req.responseType === 'arraybuffer'
+         ? toBase64(event.body as ArrayBufferLike)
+         : event.body;
+     storeInCache(body);
    }
    return of(event);
Suggested change:

concatMap(async (event: HttpEvent<unknown>) => {
  // Only cache successful HTTP responses.
  if (event instanceof HttpResponse) {
    let body = event.body;
    if (req.responseType === 'blob') {
      const arrayBuffer = await (event.body as Blob).arrayBuffer();
      body = toBase64(arrayBuffer);
    } else if (req.responseType === 'arraybuffer') {
      body = toBase64(event.body as ArrayBufferLike);
    }
    transferState.set<TransferHttpResponse>(storeKey, {
      [BODY]: body,
      [HEADERS]: getFilteredHeaders(event.headers, headersToInclude),
      [STATUS]: event.status,
      [STATUS_TEXT]: event.statusText,
      [REQ_URL]: requestUrl,
      [RESPONSE_TYPE]: req.responseType,
    });
  }
  return event;
I tried something similar, if I remember correctly, and ran into the issue that if we wrap everything with async, our existing tests that validate the flow synchronously start failing, since all cases now return a Promise. That means we'd have to update them, and I'm not sure whether that would be considered a breaking change on the user side.
Locally, with the suggested change, I got the following errors in around 19 unit tests, such as:
1) TransferCache withHttpTransferCache should cache with headers
Message:
Error: Expected zero matching requests for criteria "Match URL: /test-1?foo=1", found 1.
2) TransferCache withHttpTransferCache should cache POST with the transferCache option
Message:
Error: Expected zero matching requests for criteria "Match URL: /test-1?foo=1", found 1.
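The timing change behind those failures can be seen in a small standalone RxJS snippet (illustrative only, not the Angular code): a synchronous `concatMap` projection emits in the same call stack, while an `async` projection re-emits on a later microtask.

```ts
import { of, concatMap } from 'rxjs';

let syncValue: number | undefined;
of(1)
  .pipe(concatMap((v) => of(v)))
  .subscribe((v) => (syncValue = v));
console.log(syncValue); // 1 — emitted synchronously

let asyncValue: number | undefined;
of(1)
  .pipe(concatMap(async (v) => v))
  .subscribe((v) => (asyncValue = v));
console.log(asyncValue); // undefined — the Promise resolves on a later microtask
```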
I just updated it with the suggested change and updated the tests so they pass, given that we now have asynchronous behavior.
Previously, Blob values were passed to `Uint8Array`; this silently produced an empty array (length = 0) without throwing an error, leading to empty cached data.
Force-pushed from 7e21743 to 62f6ea1.
Previously, Blob values were passed to `Uint8Array`; this silently produced an empty array (length = 0) without throwing an error, leading to empty cached data.

Use feature detection for `Uint8Array.prototype.toBase64` and `Uint8Array.fromBase64`, falling back to the existing implementation when native support is not available. This is considerably faster for payloads around 50 kB; see the benchmark:
https://github.com/SkyZeroZx/benchmark-encode-decode-64
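A rough micro-benchmark sketch along those lines (assumptions: a runtime that already ships `Uint8Array.prototype.toBase64` and the `btoa` global; the linked repository contains the author's actual measurements):

```ts
// Compare the native toBase64 helper against a manual btoa-based fallback
// on a ~50 kB payload, similar in spirit to the linked benchmark.
const payload = new Uint8Array(50 * 1024).map(() => Math.floor(Math.random() * 256));

function fallbackToBase64(bytes: Uint8Array): string {
  let binary = '';
  for (let i = 0; i < bytes.length; i++) {
    binary += String.fromCharCode(bytes[i]);
  }
  return btoa(binary);
}

console.time('native toBase64');
for (let i = 0; i < 1_000; i++) {
  (payload as any).toBase64();
}
console.timeEnd('native toBase64');

console.time('btoa fallback');
for (let i = 0; i < 1_000; i++) {
  fallbackToBase64(payload);
}
console.timeEnd('btoa fallback');
```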