Messing with git history destroyed recent commits... how to recover?


So, I'm new to git, and I think I may have broken something beyond my ability to repair (hooray).

Here's the breakdown:

I made a remote repository, and accidentally included some large files in the initial commit. That made it difficult for people to clone it. (I didn't realize this at first, had made modifications, and had been using the repository on my own for a while.)

I learned that merely removing the files and committing the removal doesn't help people cloning the repository, and I learned to modify history through commands like:

git filter-branch --tree-filter 'rm -f public/vidos/*' HEAD

which would, to my knowledge, remove the files from the public/vidos/ directory and get rid of all memory of them.

This seemed to happen. I could clone things successfully (no out-of-memory errors), and the cloned copy didn't have the super large files in it.
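For reference, here is a fuller version of that kind of rewrite, as a sketch run in a throwaway demo repo (the public/vidos/ path is the one from the question). `--index-filter` is much faster than `--tree-filter` because it never checks files out, and `-- --all` rewrites every branch and tag, not just HEAD:

```shell
set -e
export FILTER_BRANCH_SQUELCH_WARNING=1   # skip filter-branch's safety-warning pause
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com; git config user.name you
mkdir -p public/vidos
echo "pretend this is huge" > public/vidos/big.mp4
echo "real code" > app.txt
git add -A; git commit -qm "initial commit with large files"

# Rewrite every commit on every ref, dropping public/vidos from the index.
git filter-branch --index-filter \
  'git rm -r --cached --ignore-unmatch public/vidos' \
  --prune-empty -- --all

# The old objects stay reachable via filter-branch's backup refs and the
# reflog until those expire; drop them now so clones really shrink.
git for-each-ref --format='delete %(refname)' refs/original | git update-ref --stdin
git reflog expire --expire=now --all
git gc --prune=now

git ls-tree -r --name-only HEAD    # prints app.txt only
```

Note that this only fixes the local clone; the rewritten history still has to be force-pushed, and everyone else has to re-clone.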

Then, this morning (after stupidly making sure I matched the remote repository, i.e. I got rid of my local stuff, thinking it should all be up to date except for the large files), I started to work in the directory and realized the files seem to be back at the initial check-in: none of my modifications (and there were a lot of things I had modified) are showing up.

I did

git log 

to look at the modifications, and I can see the commits (including the commits that removed the super large files from the directory). Then I did

git reset hashcode 

to roll back to the appropriate commit (with the hashcode I got from the log).

Except... even though git thinks I'm at the right commit, the files are still identical to the ones I first checked in.
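That behavior is actually what plain `git reset <commit>` does: it defaults to `--mixed`, which moves the branch pointer and the index but leaves the working tree untouched, so the files on disk don't change. A small sketch in a throwaway repo (only `--hard` also resets the files, and it is destructive to uncommitted edits):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com; git config user.name you
echo v1 > file.txt; git add -A; git commit -qm "v1"
echo v2 > file.txt; git add -A; git commit -qm "v2"

git reset -q HEAD~1          # --mixed (the default): branch moved back...
cat file.txt                 # ...but the file on disk still says v2

git reset --hard -q HEAD     # now the working tree matches the commit
cat file.txt                 # prints v1
```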

I can look at the history and see that I didn't filter-branch any of the files that went unmodified... and I'm confused as to why the changes are not there anymore. I committed... I pushed... I'm pretty sure the remote repository had the changes (I could check it out (it would take forever and run out of memory, but I'd get the files) and see the changes)... but I can't see them anymore.

Did I do something dumb? Does messing with history do things I wasn't aware of? Or is it entirely impossible that the things I described doing messed up the repository (i.e. should I be looking in an entirely different direction)?

I really, really want my modifications back... replicating the code (and remembering what I did) is going to be difficult. Is there anything I can do?
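The usual recovery route here is the reflog: git seldom deletes commits right away, and the reflog records every position HEAD has held, so a commit that vanished after a reset (or a filter-branch) can often still be rescued. A sketch in a throwaway repo (`rescued` is a branch name I made up):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com; git config user.name you
echo v1 > code.txt; git add -A; git commit -qm "old state"
echo precious > code.txt; git add -A; git commit -qm "precious work"
lost=$(git rev-parse HEAD)
git reset --hard -q HEAD~1     # "precious work" is gone from the branch

git reflog                     # lists where HEAD used to be, hashes included
git branch rescued "$lost"     # in real life, use the hash reflog printed
git show rescued:code.txt      # prints precious
```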

Edit:

Okay, so the reset didn't work, but doing:

git checkout hashcode 

seems to work fine, and I can see my code changes.

But checkout means I'm not on a branch, and I can't commit these changes as the most recent ones (it thinks it's up to date). Any ideas on how I can make that commit the HEAD? And once it's the HEAD, will it get rid of the branch filtering I did to get rid of the super big files? And if it does, do you have any advice on how to get rid of the super big files without this headache again?
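After `git checkout <hashcode>` you are on a detached HEAD, so commits made there belong to no branch. There are two common ways to reattach, sketched below in a throwaway repo (`recovered` is a branch name I made up; `$branch` stands for whatever your branch is called):

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com; git config user.name you
branch=$(git symbolic-ref --short HEAD)   # the default branch's name
echo good > f.txt; git add -A; git commit -qm "good"
good=$(git rev-parse HEAD)
echo bad > f.txt; git add -A; git commit -qm "bad"

git checkout -q "$good"            # detached HEAD, as in the question

# Option 1: keep working on a brand-new branch rooted here
git checkout -q -b recovered
# Option 2: force the original branch to point here (rewrites its history!)
git branch -f "$branch" HEAD
git checkout -q "$branch"
cat f.txt                          # prints good
```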

If there is no other way, can I check out two copies (one at HEAD, one at the last good commit) and manually copy and paste the files into the HEAD copy, then commit? That seems like it would work, but it's not clean.
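A tidier version of that two-copies idea: git can copy files out of any commit straight into the current working tree, no second clone needed. A sketch in a throwaway repo:

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com; git config user.name you
echo good > f.txt; git add -A; git commit -qm "good"
old=$(git rev-parse HEAD)
echo bad > f.txt; git add -A; git commit -qm "bad"

git checkout "$old" -- f.txt   # f.txt as it was in the old commit, staged
git commit -qm "restore f.txt from the old commit"
cat f.txt                      # prints good
```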

Basically,

git checkout hashcode

does what I thought git reset would do. I'm still not sure why my changes got rolled back when the branch was filtered, but I can at least see my code again, so I manually copied the changes into a clone of HEAD and checked the changes in again. Not an elegant solution, but I got to keep the filtering and still have the latest code.

I think part of what happened was my incomplete understanding of git repositories. I'd never managed one before, and had a half-conceived notion that the git repository is "up to date" and has the latest version of things. When I did the branch filtering I was in the git repository, and noticed the filters showing up in git status as needing to be committed. So I did. But the repository wasn't up to date, and in committing (and blithely overriding things) I overwrote my changes and rolled things back.
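As for avoiding the headache next time: the simplest prevention is to keep the large files out of history in the first place with a .gitignore, so no rewriting is ever needed. A sketch, using the public/vidos/ path from the question:

```shell
set -e
repo=$(mktemp -d); cd "$repo"
git init -q
git config user.email you@example.com; git config user.name you
printf 'public/vidos/\n' > .gitignore
mkdir -p public/vidos
echo "pretend this is huge" > public/vidos/big.mp4
echo "real code" > app.txt
git add -A                        # the ignored directory is never staged
git commit -qm "initial commit, large files never tracked"
git ls-tree -r --name-only HEAD   # only .gitignore and app.txt
```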

