- git-odb
  - all docs, sans examples
  - Rename pack data/pack index `Kind` to `Version` or similar, because that's what it really is.
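As a sketch of the rename's direction (the module layout here is an assumption, though the variant sets mirror the on-disk pack formats: data files come in versions 2 and 3, index files in versions 1 and 2):

```rust
// Illustrative only: `Kind` becomes `Version`, since the value identifies
// the on-disk format version of pack data and pack index files.
pub mod pack {
    pub mod data {
        /// The version of a pack data file.
        pub enum Version {
            V2,
            V3,
        }
    }
    pub mod index {
        /// The version of a pack index file.
        pub enum Version {
            V1,
            V2,
        }
    }
}
```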
- git-object refactor
  - split `Id` and everything hash related into `git-id` (see the sketch below)
  - use `git-id` inside of `git-features`, remove cycle
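A minimal sketch of what the split-out crate could contain; the names are assumptions, the point being that hash-related types land in a leaf crate with no dependencies back onto the rest of the workspace:

```rust
// Hypothetical `git-id` crate surface: only the object id and hash-related
// constants, with no dependency on `git-object` or `git-features`.
pub const SIZE_OF_SHA1_DIGEST: usize = 20;

/// An owned SHA-1 object id.
#[derive(Debug, PartialEq, Eq, Hash, Clone, Copy)]
pub struct Id(pub [u8; SIZE_OF_SHA1_DIGEST]);

/// A borrowed object id, e.g. pointing into a pack or index buffer.
#[derive(Debug, PartialEq, Eq, Hash, Clone, Copy)]
pub struct BorrowedId<'a>(pub &'a [u8; SIZE_OF_SHA1_DIGEST]);
```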
- Documentation (with `deny(missing_docs)`)
  - git-features
  - git-object
  - git-url
  - git-ref
  - git-packetline
  - git-protocol
  - git-transport
  - git-commitgraph
- git-config
  - A complete implementation; writing the git remote configuration is needed for finalizing the clone.
  - A `Config` type which integrates multiple files into one interface, much like a multi-file version of `File` (see the sketch below).
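A sketch of the layering idea, assuming a `File` type with single-file lookup; names and signatures are illustrative, not the actual git-config API:

```rust
/// One parsed configuration file (system, global, or repository-local).
struct File {/* parsed sections and key-value pairs of a single file */}

impl File {
    /// Look up a key within this single file.
    fn value(&self, _key: &str) -> Option<String> {
        unimplemented!("single-file lookup")
    }
}

/// Integrates multiple files into one interface: later files override
/// earlier ones, mirroring how `git config` resolves values.
struct Config {
    /// Ordered from lowest to highest precedence.
    files: Vec<File>,
}

impl Config {
    fn value(&self, key: &str) -> Option<String> {
        // The highest-precedence file that defines the key wins.
        self.files.iter().rev().find_map(|file| file.value(key))
    }
}
```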
- git-ref
  - create ref pointing to ID
  - make sure the path towards symbolic refs stays open, and allow specifying whether these should be followed or not (see the sketch below)
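The on-disk formats below are real git (a symbolic ref file contains `ref: <name>`, a direct ref a hex id); the function names and signatures are assumptions for illustration:

```rust
use std::{fs, io, path::Path};

/// Create a loose ref file pointing directly at an object id.
fn create_ref(git_dir: &Path, name: &str, hex_id: &str) -> io::Result<()> {
    let path = git_dir.join(name); // e.g. "refs/heads/main"
    fs::create_dir_all(path.parent().expect("ref names contain a directory"))?;
    fs::write(path, format!("{}\n", hex_id))
}

/// Resolve a ref to its content, optionally following symbolic refs.
/// Without following, a symbolic ref is returned verbatim ("ref: …").
fn resolve_ref(git_dir: &Path, name: &str, follow_symbolic: bool) -> io::Result<String> {
    let content = fs::read_to_string(git_dir.join(name))?;
    let content = content.trim();
    match content.strip_prefix("ref: ") {
        Some(target) if follow_symbolic => resolve_ref(git_dir, target, true),
        _ => Ok(content.to_owned()),
    }
}
```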
- git-index
  - create an index from a tree (see the sketch below)
  - checkout an index to a worktree
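The index is a flat, path-sorted list while trees are recursive, so "index from a tree" boils down to a flattening walk. A sketch under that assumption; all types here are illustrative, not the git-index API:

```rust
/// One entry of a tree object (illustrative shape, not the real type).
struct TreeEntry {
    name: String,
    oid: [u8; 20],
    is_tree: bool,
}

/// Recursively flatten a tree into (path, id) pairs; the caller sorts the
/// result by path, since index entries are stored sorted.
fn flatten_tree(
    prefix: &str,
    entries: &[TreeEntry],
    read_tree: &impl Fn(&[u8; 20]) -> Vec<TreeEntry>,
    out: &mut Vec<(String, [u8; 20])>,
) {
    for entry in entries {
        let path = if prefix.is_empty() {
            entry.name.clone()
        } else {
            format!("{}/{}", prefix, entry.name)
        };
        if entry.is_tree {
            // Recurse into sub-trees, extending the path prefix.
            flatten_tree(&path, &read_tree(&entry.oid), read_tree, out);
        } else {
            out.push((path, entry.oid));
        }
    }
}
```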
- git-repository
  - instance for a valid-looking repository (see the check sketched below)
  - support shallow repos/references
  - create/update refs as received from clone/git-receive-pack
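What "valid-looking" might check, based on what git itself requires of a repository directory; the function name is an assumption:

```rust
use std::path::Path;

/// A plausible minimal validity check: HEAD plus the objects and refs
/// directories must exist, as in git's own repository detection.
fn looks_like_git_repo(git_dir: &Path) -> bool {
    git_dir.join("HEAD").is_file()
        && git_dir.join("objects").is_dir()
        && git_dir.join("refs").is_dir()
}
```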
- gix clone
  - try initializing a repo at the output path - if one exists, use it to learn about the pack location and place the new pack there, and allow the repo to create refs somehow (see the sketch after this list)
    - _probably this is done using the repository itself, which steers the whole process and injects its own delegates_
  - otherwise create the scaffolding needed for a new repository, probably based on the `init` implementation
  - receive pack
    - resolve thin pack with `Bundle`
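The decision from the first item above, as a sketch; `Repository`, `open`, and `init` are stand-ins, not an actual API:

```rust
use std::path::Path;

struct Repository;

impl Repository {
    /// Open an existing, valid-looking repository at `path`, if there is one.
    fn open(_path: &Path) -> Option<Repository> {
        unimplemented!()
    }
    /// Create the scaffolding for a new repository, as `init` would.
    fn init(_path: &Path) -> Repository {
        unimplemented!()
    }
}

/// The repository steers the clone: reuse one found at the output path to
/// learn about pack locations and ref creation, else scaffold a fresh one.
fn repo_for_clone(output: &Path) -> Repository {
    Repository::open(output).unwrap_or_else(|| Repository::init(output))
}
```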
- git-repository
  - clone from https remote
  - multi-db (incorporate object lookup for loose objects and packs; see the sketch below)
    - single threaded
    - optional object cache
  - fs-check - verify all object content of a git repository
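A sketch of the compound lookup: consult an optional object cache, then the loose object store, then each pack in turn. All types here are assumptions:

```rust
use std::collections::HashMap;

struct LooseDb;
struct PackDb;

impl LooseDb {
    fn find(&self, _id: &[u8; 20]) -> Option<Vec<u8>> {
        unimplemented!("lookup in .git/objects/xx/…")
    }
}

impl PackDb {
    fn find(&self, _id: &[u8; 20]) -> Option<Vec<u8>> {
        unimplemented!("lookup via pack index, then decode from pack data")
    }
}

struct MultiDb {
    loose: LooseDb,
    packs: Vec<PackDb>,
    /// The optional object cache; a plain map stands in for a real policy.
    cache: HashMap<[u8; 20], Vec<u8>>,
}

impl MultiDb {
    fn find(&mut self, id: &[u8; 20]) -> Option<Vec<u8>> {
        if let Some(obj) = self.cache.get(id) {
            return Some(obj.clone());
        }
        // Loose objects take precedence, then packs are tried in order.
        let obj = self
            .loose
            .find(id)
            .or_else(|| self.packs.iter().find_map(|pack| pack.find(id)))?;
        self.cache.insert(*id, obj.clone());
        Some(obj)
    }
}
```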
- A plumbing command to extract some value from the current implementation, maybe statistics, or verification
- Application of the command above in a stress test
- Questions
  - What to do with the 'extra-garbage'? Some code is commented out.
To be picked in any order…
- google-apis-rs PR
  - when Docker for ARM is available, use it to run x64 images and see if this works for running the toolchain locally as before
    - alternatively, use an Intel Mac for now
- prodash
  - finish transitioning to futures-lite to get rid of the futures-util dependency and reduce compile times (see the sketch below)
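Not the prodash code itself, just a sketch showing that the commonly used combinators have futures-lite equivalents, which is what makes dropping futures-util feasible:

```rust
use futures_lite::{future, stream, StreamExt};

fn main() {
    // `stream::iter`, `map` and `collect` all come from futures-lite alone.
    let doubled: Vec<i32> = future::block_on(stream::iter(1..=3).map(|x| x * 2).collect());
    assert_eq!(doubled, vec![2, 4, 6]);
}
```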
- criner
  - upgrade to prodash 11.0
  - switch to `isahc` or `ureq` (blocking, but `unblock` could be used for that); `isahc` seems to allow async reading of bodies, which would make it possible to get rid of reqwest and tokio. Redirects are configurable.
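A sketch of the `ureq` option, assuming the `blocking` crate's `unblock` for running the blocking call off the async executor; error handling is simplified:

```rust
use blocking::unblock;

/// Fetch a response body without stalling the async executor: `ureq` runs
/// on `blocking`'s thread pool and the result is awaited.
async fn fetch_body(url: String) -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
    unblock(move || -> Result<String, Box<dyn std::error::Error + Send + Sync>> {
        let body = ureq::get(&url).call()?.into_string()?;
        Ok(body)
    })
    .await
}
```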