No more fs.readFileSync and different hash functions #50
Conversation
- [`getRecursive`](#getrecursive)
- [`getRecursiveStream`](#getrecursivestream)
- [`remove`](#remove)
* [License](#license)
sweeet! Thank you 👍
📜 for the 📜 god
@@ -99,10 +122,12 @@ var node = new ipfsMDAG.DAGNode([<data>, <[links]>])

> (property) an array of `DAGLink`s belonging to the node

-#### `multihash`
+#### `multihash(fn)`
This doesn't expect an actual function, just the hash identifier. Can we use a better name for this arg?
Also, if there is a default, it should be `[<fn>]`.
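For illustration, a minimal sketch of the distinction, assuming the argument is a hash name as understood by `multihashes`; the package name, call shape, and hash name are assumptions, not taken from this PR:

```js
const ipfsMDAG = require('ipfs-merkle-dag') // package name assumed

const node = new ipfsMDAG.DAGNode(new Buffer('hello'))

// The argument is a hash identifier string such as 'sha2-256',
// not a callable function, so a name like `hashAlg` would be clearer.
const hash = node.multihash('sha2-256')
```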
const mh = require('multihashes')

const util = require('./util')
const DAGLink = require('./dag-link')

-const proto = protobuf(fs.readFileSync(path.join(__dirname, 'merkledag.proto')))
+const proto = protobuf(require('./merkledag.proto'))
This is new.
@@ -157,12 +155,17 @@ module.exports = class DAGNode {

  // Encoded returns the encoded raw data version of a Node instance.
  // It may use a cached encoded version, unless the force flag is given.
-  encoded (force) {
+  encoded (fn, force) {
This is a breaking API change; `(force, fn)` wouldn't be.
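A minimal sketch of the compatibility concern; the calls and the hash name are illustrative, not from this PR:

```js
// Existing callers pass `force` as the first argument:
node.encoded(true)                // fine with the old encoded (force)

// With encoded (fn, force), `true` is now read as the hash function
// argument, silently changing behaviour for those callers.

// With encoded (force, fn) instead, old calls keep working and new
// callers can opt in explicitly:
node.encoded(true)                // still just forces re-encoding
node.encoded(true, 'sha2-256')    // hypothetical: also picks a hash function
```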
okay will change
@@ -1,3 +1,6 @@
'use strict'

module.exports = new Buffer(`
This is 'ugly'... now I see how the require worked. This doesn't remove complexity, it even adds some.
It removes a lot of complexity for users who are not using aegir. These proto files are the reason we need to use brfs as a preprocessor everywhere. If we drop it, the code works out of the box in webpack and browserify.
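For readers following along, a rough sketch of the pattern under discussion, assuming a `merkledag.proto.js` module and the `protocol-buffers` package; the schema shown is abbreviated and illustrative:

```js
// merkledag.proto.js — the schema lives in a plain JS module instead of a
// .proto file read with fs.readFileSync, so no brfs transform is needed.
'use strict'

module.exports = new Buffer(`
  message PBLink {
    optional bytes Hash = 1;
    optional string Name = 2;
    optional uint64 Tsize = 3;
  }

  message PBNode {
    optional bytes Data = 1;
    repeated PBLink Links = 2;
  }
`)
```

The consumer side is then the plain `const proto = protobuf(require('./merkledag.proto'))` shown in the diff above, which webpack and browserify resolve without extra configuration.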
Is this the only `fs` use that needs to be brfs'ified in js-ipfs-api? brfs'ifying is a standard step when transpiling code for the browser anyway. This seems to me more like a way to postpone writing proper documentation that explains how to do this, which we have to write anyway.
The only reason end users need brfs today is our use of `fs.readFileSync` to load proto files. We can avoid this by simply using a pattern like this one. That means reduced build time for us and our users, in addition to one less thing developers have to worry about. `brfs` is nice, but our end goal should be that `webpack` and `browserify` work on our modules without any additional configuration.
This is the first step towards that :)
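To make the "additional configuration" concrete, an illustrative before/after using browserify's programmatic API (not this repository's actual build setup):

```js
const browserify = require('browserify')
const fs = require('fs')

// Before: consumers had to add the brfs transform so that
// fs.readFileSync(...) calls were inlined at bundle time.
browserify('./index.js')
  .transform('brfs')
  .bundle()
  .pipe(fs.createWriteStream('bundle-with-brfs.js'))

// After: with the proto schema exported from a plain JS module,
// a default browserify (or webpack) run is enough.
browserify('./index.js')
  .bundle()
  .pipe(fs.createWriteStream('bundle.js'))
```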
@dignifiedquire my main point is: we need a section in our main projects (js-ipfs, js-ipfs-api and js-libp2p) that explains to our users how to 'webpack' all of this stuff. Simply put, it is better to have a walkthrough, even an obvious one, than nothing. As an example, I don't believe that js-ipfs will ever stop needing a special webpack config.
> we need a section in our main projects (js-ipfs, js-ipfs-api and js-libp2p) that explains to our users how to 'webpack' all of this stuff
Yes, but we still should aim to make that explanation as simple as possible :)
> I don't believe that js-ipfs will ever stop needing a special webpack config.
I have concrete plans for this and a roadmap in my head for how to get there; it is coming soon. Dropping `fs.readFile` is the first step on that road.
Closing as this module is deprecated now. Ref #62
feat(dag-node): allow for different hash functions
fix(dag-node): remove dependency on fs.readFileSync
Closes #48, #49