Add web-based .packed explorer, updated parser and Ghidra utility script

This commit is contained in:
Daniel S. 2023-05-07 21:29:21 +02:00
parent 8e0df74541
commit 58407ecc9f
35 changed files with 3897 additions and 353 deletions

24
scrapper_web/.gitignore vendored Normal file
@@ -0,0 +1,24 @@
# Logs
logs
*.log
npm-debug.log*
yarn-debug.log*
yarn-error.log*
pnpm-debug.log*
lerna-debug.log*
node_modules
dist
dist-ssr
*.local
# Editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
.DS_Store
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?

3
scrapper_web/.vscode/extensions.json vendored Normal file
@@ -0,0 +1,3 @@
{
  "recommendations": ["svelte.svelte-vscode"]
}

47
scrapper_web/README.md Normal file
@@ -0,0 +1,47 @@
# Svelte + Vite

This template should help get you started developing with Svelte in Vite.

## Recommended IDE Setup

[VS Code](https://code.visualstudio.com/) + [Svelte](https://marketplace.visualstudio.com/items?itemName=svelte.svelte-vscode).

## Need an official Svelte framework?

Check out [SvelteKit](https://github.com/sveltejs/kit#readme), which is also powered by Vite. Deploy anywhere with its serverless-first approach and adapt to various platforms, with out of the box support for TypeScript, SCSS, and Less, and easily-added support for mdsvex, GraphQL, PostCSS, Tailwind CSS, and more.

## Technical considerations

**Why use this over SvelteKit?**

- It brings its own routing solution which might not be preferable for some users.
- It is first and foremost a framework that just happens to use Vite under the hood, not a Vite app.

This template contains as little as possible to get started with Vite + Svelte, while taking into account the developer experience with regards to HMR and intellisense. It demonstrates capabilities on par with the other `create-vite` templates and is a good starting point for beginners dipping their toes into a Vite + Svelte project.

Should you later need the extended capabilities and extensibility provided by SvelteKit, the template has been structured similarly to SvelteKit so that it is easy to migrate.

**Why `global.d.ts` instead of `compilerOptions.types` inside `jsconfig.json` or `tsconfig.json`?**

Setting `compilerOptions.types` shuts out all other types not explicitly listed in the configuration. Using triple-slash references keeps the default TypeScript setting of accepting type information from the entire workspace, while also adding `svelte` and `vite/client` type information.

**Why include `.vscode/extensions.json`?**

Other templates indirectly recommend extensions via the README, but this file allows VS Code to prompt the user to install the recommended extension upon opening the project.

**Why enable `checkJs` in the JS template?**

Most cases of changing variable types at runtime are likely to be accidental rather than deliberate. This provides advanced typechecking out of the box. Should you like to take advantage of the dynamically-typed nature of JavaScript, it is trivial to change the configuration.

**Why is HMR not preserving my local component state?**

HMR state preservation comes with a number of gotchas! It has been disabled by default in both `svelte-hmr` and `@sveltejs/vite-plugin-svelte` due to its often surprising behavior. You can read the details [here](https://github.com/rixo/svelte-hmr#svelte-hmr).

If you have state that's important to retain within a component, consider creating an external store which would not be replaced by HMR.

```js
// store.js
// An extremely simple external store
import { writable } from 'svelte/store'
export default writable(0)
```

13
scrapper_web/index.html Normal file
@@ -0,0 +1,13 @@
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="UTF-8" />
    <link rel="icon" type="image/svg+xml" href="/vite.svg" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0" />
    <title>Vite + Svelte</title>
  </head>
  <body>
    <div id="app"></div>
    <script type="module" src="/src/main.js"></script>
  </body>
</html>

33
scrapper_web/jsconfig.json Normal file

@@ -0,0 +1,33 @@
{
  "compilerOptions": {
    "moduleResolution": "Node",
    "target": "ESNext",
    "module": "ESNext",
    /**
     * svelte-preprocess cannot figure out whether you have
     * a value or a type, so tell TypeScript to enforce using
     * `import type` instead of `import` for Types.
     */
    "importsNotUsedAsValues": "error",
    "isolatedModules": true,
    "resolveJsonModule": true,
    /**
     * To have warnings / errors of the Svelte compiler at the
     * correct position, enable source maps by default.
     */
    "sourceMap": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    /**
     * Typecheck JS in `.svelte` and `.js` files by default.
     * Disable this if you'd like to use dynamic types.
     */
    "checkJs": false
  },
  /**
   * Use global.d.ts instead of compilerOptions.types
   * to avoid limiting type declarations.
   */
  "include": ["src/**/*.d.ts", "src/**/*.js", "src/**/*.svelte"]
}

26
scrapper_web/package.json Normal file
@@ -0,0 +1,26 @@
{
  "name": "scrapper_web",
  "private": true,
  "version": "0.0.0",
  "type": "module",
  "scripts": {
    "dev": "vite",
    "build": "wasm-pack build ./scrapper -t web && vite build",
    "preview": "vite preview"
  },
  "devDependencies": {
    "@sveltejs/vite-plugin-svelte": "^2.0.2",
    "@tailwindcss/forms": "^0.5.3",
    "autoprefixer": "^10.4.13",
    "cssnano": "^5.1.14",
    "cssnano-preset-advanced": "^5.3.9",
    "daisyui": "^2.50.0",
    "filedrop-svelte": "^0.1.2",
    "postcss": "^8.4.21",
    "svelte": "^3.55.1",
    "svelte-preprocess": "^5.0.1",
    "tailwindcss": "^3.2.4",
    "vite": "^4.1.0",
    "vite-plugin-wasm-pack": "^0.1.12"
  }
}

1777
scrapper_web/pnpm-lock.yaml Normal file

File diff suppressed because it is too large

11
scrapper_web/postcss.config.cjs Normal file

@@ -0,0 +1,11 @@
let cssnano_plugin = {};
if (process.env.NODE_ENV === "production") {
  cssnano_plugin = { cssnano: { preset: "advanced" } };
}
module.exports = {
  plugins: {
    tailwindcss: {},
    autoprefixer: {},
    ...cssnano_plugin,
  },
};

1
scrapper_web/public/vite.svg Normal file

@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" aria-hidden="true" role="img" class="iconify iconify--logos" width="31.88" height="32" preserveAspectRatio="xMidYMid meet" viewBox="0 0 256 257"><defs><linearGradient id="IconifyId1813088fe1fbc01fb466" x1="-.828%" x2="57.636%" y1="7.652%" y2="78.411%"><stop offset="0%" stop-color="#41D1FF"></stop><stop offset="100%" stop-color="#BD34FE"></stop></linearGradient><linearGradient id="IconifyId1813088fe1fbc01fb467" x1="43.376%" x2="50.316%" y1="2.242%" y2="89.03%"><stop offset="0%" stop-color="#FFEA83"></stop><stop offset="8.333%" stop-color="#FFDD35"></stop><stop offset="100%" stop-color="#FFA800"></stop></linearGradient></defs><path fill="url(#IconifyId1813088fe1fbc01fb466)" d="M255.153 37.938L134.897 252.976c-2.483 4.44-8.862 4.466-11.382.048L.875 37.958c-2.746-4.814 1.371-10.646 6.827-9.67l120.385 21.517a6.537 6.537 0 0 0 2.322-.004l117.867-21.483c5.438-.991 9.574 4.796 6.877 9.62Z"></path><path fill="url(#IconifyId1813088fe1fbc01fb467)" d="M185.432.063L96.44 17.501a3.268 3.268 0 0 0-2.634 3.014l-5.474 92.456a3.268 3.268 0 0 0 3.997 3.378l24.777-5.718c2.318-.535 4.413 1.507 3.936 3.838l-7.361 36.047c-.495 2.426 1.782 4.5 4.151 3.78l15.304-4.649c2.372-.72 4.652 1.36 4.15 3.788l-11.698 56.621c-.732 3.542 3.979 5.473 5.943 2.437l1.313-2.028l72.516-144.72c1.215-2.423-.88-5.186-3.54-4.672l-25.505 4.922c-2.396.462-4.435-1.77-3.759-4.114l16.646-57.705c.677-2.35-1.37-4.583-3.769-4.113Z"></path></svg>


14
scrapper_web/scrapper/.gitignore vendored Normal file
@@ -0,0 +1,14 @@
# Generated by Cargo
# will have compiled files and executables
debug/
target/
# Remove Cargo.lock from gitignore if creating an executable, leave it for libraries
# More information here https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
Cargo.lock
# These are backup files generated by rustfmt
**/*.rs.bk
# MSVC Windows builds of rustc generate these, which store debugging information
*.pdb

31
scrapper_web/scrapper/Cargo.toml Normal file

@@ -0,0 +1,31 @@
[package]
name = "scrapper"
version = "0.1.0"
authors = []
edition = "2021"

[lib]
crate-type = ["cdylib", "rlib"]

[profile.release]
lto = true

# See more keys and their definitions at https://doc.rust-lang.org/cargo/reference/manifest.html

[dependencies]
aes = "0.8.2"
anyhow = "1.0.69"
binrw = "0.11.1"
cbc = "0.1.2"
console_error_panic_hook = "0.1.7"
derivative = "2.2.0"
js-sys = "0.3.61"
pelite = "0.10.0"
serde = { version = "1.0.152", features = ["derive"] }
serde-wasm-bindgen = "0.4.5"
wasm-bindgen = "0.2.83"
wasm-bindgen-file-reader = "1.0.0"
web-sys = { version = "0.3.61", features = ["File", "BlobPropertyBag", "Blob", "Url"] }

[package.metadata.wasm-pack.profile.release]
wasm-opt = ["-O4"]

23
scrapper_web/scrapper/README.md Normal file

@@ -0,0 +1,23 @@
# scrapper

## Usage

[rsw-rs doc](https://github.com/lencx/rsw-rs)

```bash
# install rsw
cargo install rsw

# --- help ---
# rsw help
rsw -h
# new help
rsw new -h

# --- usage ---
# dev
rsw watch
# production
rsw build
```

155
scrapper_web/scrapper/src/lib.rs Normal file

@@ -0,0 +1,155 @@
use binrw::{binread, BinReaderExt};
use serde::Serialize;
use std::collections::BTreeMap;
use std::io::{Read, Seek, SeekFrom};
use wasm_bindgen::prelude::*;
use wasm_bindgen_file_reader::WebSysFile;
use web_sys::{Blob, File};

type JsResult<T> = Result<T, JsValue>;

#[binread]
#[derive(Serialize, Debug)]
struct ScrapFile {
    #[br(temp)]
    name_len: u32,
    #[br(count = name_len)]
    #[br(map = |s: Vec<u8>| String::from_utf8_lossy(&s).to_string())]
    path: String,
    size: u32,
    offset: u32,
}

#[binread]
#[br(magic = b"BFPK", little)]
#[derive(Serialize, Debug)]
struct PackedHeader {
    version: u32,
    #[br(temp)]
    num_files: u32,
    #[br(count = num_files)]
    files: Vec<ScrapFile>,
}

#[derive(Serialize, Debug)]
#[serde(tag = "type", rename_all = "snake_case")]
enum DirectoryTree {
    File {
        size: u32,
        offset: u32,
        file_index: u8,
    },
    Directory {
        entries: BTreeMap<String, DirectoryTree>,
    },
}

#[wasm_bindgen(inspectable)]
pub struct MultiPack {
    files: Vec<(String, WebSysFile)>,
    tree: DirectoryTree,
}

fn blob_url(buffer: &[u8]) -> JsResult<String> {
    let uint8arr = js_sys::Uint8Array::new(&unsafe { js_sys::Uint8Array::view(buffer) }.into());
    let array = js_sys::Array::new();
    array.push(&uint8arr.buffer());
    let blob = Blob::new_with_u8_array_sequence_and_options(
        &array,
        web_sys::BlobPropertyBag::new().type_("application/octet-stream"),
    )
    .unwrap();
    web_sys::Url::create_object_url_with_blob(&blob)
}

#[wasm_bindgen]
impl MultiPack {
    #[wasm_bindgen(constructor)]
    pub fn parse(files: Vec<File>) -> Self {
        let mut tree = DirectoryTree::default();
        let mut web_files = vec![];
        for (file_index, file) in files.into_iter().enumerate() {
            let file_name = file.name();
            let mut fh = WebSysFile::new(file);
            let header = fh.read_le::<PackedHeader>().unwrap();
            tree.merge(&header.files, file_index.try_into().unwrap());
            web_files.push((file_name, fh));
        }
        Self {
            tree,
            files: web_files,
        }
    }

    #[wasm_bindgen]
    pub fn tree(&self) -> JsValue {
        serde_wasm_bindgen::to_value(&self.tree).unwrap()
    }

    #[wasm_bindgen]
    pub fn download(&mut self, file_index: u8, offset: u32, size: u32) -> Result<JsValue, JsValue> {
        let Some((_, file)) = self.files.get_mut(file_index as usize) else {
            return Err("File not found".into());
        };
        let mut buffer = vec![0u8; size as usize];
        file.seek(SeekFrom::Start(offset as u64))
            .map_err(|e| format!("Failed to seek file: {e}"))?;
        // read_exact instead of read: a short read would silently truncate the file
        file.read_exact(&mut buffer)
            .map_err(|e| format!("Failed to read from file: {e}"))?;
        Ok(blob_url(&buffer)?.into())
    }
}

impl Default for DirectoryTree {
    fn default() -> Self {
        Self::Directory {
            entries: Default::default(),
        }
    }
}

impl DirectoryTree {
    fn add_child(&mut self, name: &str, node: Self) -> &mut Self {
        match self {
            Self::File { .. } => panic!("Can't add child to file!"),
            Self::Directory { entries } => entries.entry(name.to_owned()).or_insert(node),
        }
    }

    fn merge(&mut self, files: &[ScrapFile], file_index: u8) {
        for file in files {
            let mut folder = &mut *self;
            let path: Vec<_> = file.path.split('/').collect();
            if let Some((filename, path)) = path.as_slice().split_last() {
                for part in path {
                    let DirectoryTree::Directory { entries } = folder else {
                        unreachable!();
                    };
                    folder = entries.entry(part.to_string()).or_default();
                }
                folder.add_child(
                    filename,
                    DirectoryTree::File {
                        size: file.size,
                        offset: file.offset,
                        file_index,
                    },
                );
            }
        }
    }
}

#[wasm_bindgen(start)]
pub fn main() -> Result<(), JsValue> {
    console_error_panic_hook::set_once();
    Ok(())
}
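The header layout that `PackedHeader`/`ScrapFile` describe (a `BFPK` magic, little-endian u32 version and file count, then length-prefixed paths with size/offset pairs) can also be checked outside the browser. A minimal standalone sketch in Python — field names come from the Rust structs above, everything else is illustrative:

```python
import struct
from io import BytesIO

def parse_packed_header(fh):
    # "BFPK" magic, then little-endian u32 version and file count
    if fh.read(4) != b"BFPK":
        raise ValueError("not a BFPK .packed file")
    version, num_files = struct.unpack("<II", fh.read(8))
    files = []
    for _ in range(num_files):
        # length-prefixed path, then size and absolute offset of the stored file
        (name_len,) = struct.unpack("<I", fh.read(4))
        path = fh.read(name_len).decode("utf-8", "replace")
        size, offset = struct.unpack("<II", fh.read(8))
        files.append({"path": path, "size": size, "offset": offset})
    return version, files

# Round-trip against a hand-built header:
blob = (b"BFPK" + struct.pack("<II", 1, 1)
        + struct.pack("<I", 3) + b"a/b" + struct.pack("<II", 10, 100))
version, files = parse_packed_header(BytesIO(blob))
```

The `offset`/`size` pair is used verbatim by `MultiPack::download` to seek into the archive and slice out the file.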

13
scrapper_web/src/App.svelte Normal file

@@ -0,0 +1,13 @@
<script>
  import Explorer from "./lib/Explorer.svelte";
</script>

<main>
  <div>
    <h1>Scrapland .packed explorer</h1>
    <Explorer />
  </div>
</main>

<style>
</style>

109
scrapper_web/src/app.pcss Normal file
@@ -0,0 +1,109 @@
:root {
  font-family: Inter, system-ui, Avenir, Helvetica, Arial, sans-serif;
  line-height: 1.5;
  font-weight: 400;
  color-scheme: light dark;
  color: rgba(255, 255, 255, 0.87);
  background-color: #242424;
  font-synthesis: none;
  text-rendering: optimizeLegibility;
  -webkit-font-smoothing: antialiased;
  -moz-osx-font-smoothing: grayscale;
  -webkit-text-size-adjust: 100%;
}

a {
  font-weight: 500;
  color: #646cff;
  text-decoration: inherit;
}

a:hover {
  color: #535bf2;
}

body {
  margin: 0;
  display: flex;
  place-items: center;
  min-width: 320px;
  min-height: 100vh;
}

h1 {
  font-size: 3.2em;
  line-height: 1.1;
}

.card {
  padding: 2em;
}

#app {
  max-width: 1280px;
  margin: 0 auto;
  padding: 2rem;
  text-align: center;
}

li {
  text-align: left;
}

button {
  border-radius: 8px;
  border: 1px solid transparent;
  padding: 0.6em 1.2em;
  font-size: 1em;
  font-weight: 500;
  font-family: inherit;
  background-color: #1a1a1a;
  cursor: pointer;
  transition: border-color 0.25s;
}

button:hover {
  border-color: #646cff;
}

button:focus,
button:focus-visible {
  outline: 4px auto -webkit-focus-ring-color;
}

@media (prefers-color-scheme: light) {
  :root {
    color: #213547;
    background-color: #ffffff;
  }

  a:hover {
    color: #747bff;
  }

  button {
    background-color: #f9f9f9;
  }
}

.lds-dual-ring {
  display: inline-block;
  width: 80px;
  height: 80px;
}

.lds-dual-ring:after {
  content: " ";
  display: block;
  width: 64px;
  height: 64px;
  margin: 8px;
  border-radius: 50%;
  border: 6px solid #fff;
  border-color: #fff transparent #fff transparent;
  animation: lds-dual-ring 1.2s linear infinite;
}

@keyframes lds-dual-ring {
  0% {
    transform: rotate(0deg);
  }
  100% {
    transform: rotate(360deg);
  }
}

52
scrapper_web/src/lib/Explorer.svelte Normal file

@@ -0,0 +1,52 @@
<script>
  import { onMount } from "svelte";
  import TreeView from "./TreeView.svelte";
  import ScrapWorker from "../scrapper.worker?worker";

  let worker;
  let tree;
  let busy = false;
  let files;

  onMount(async () => {
    worker = new ScrapWorker();
    worker.onmessage = (msg) => {
      console.log({ msg });
      if (msg.data) {
        if (msg.data.parse) {
          tree = msg.data.parse;
          busy = false;
        }
        if (msg.data.download) {
          let [file_name, url] = msg.data.download;
          let dl = document.createElement("a");
          dl.href = url;
          dl.download = file_name;
          dl.click();
        }
      }
    };
  });

  function process() {
    console.log({ files });
    busy = true;
    worker.postMessage({ parse: files });
  }
</script>

<div class:lds-dual-ring={busy}>
  <input
    type="file"
    multiple
    accept=".packed"
    class="file-input file-input-bordered w-full max-w-xs"
    disabled={busy}
    bind:files
    on:change={process}
  />
</div>

{#if tree}
  {#each [...tree.entries] as [name, child]}
    <TreeView scrap={worker} label={name} tree={child} />
  {/each}
{/if}

56
scrapper_web/src/lib/TreeView.svelte Normal file

@@ -0,0 +1,56 @@
<script>
  export let tree;
  export let scrap;
  export let label = undefined;

  let expanded = false;

  function toggleExpansion() {
    expanded = !expanded;
  }

  function download() {
    console.log({ label, tree });
    scrap.postMessage({ download: { label, ...tree } });
  }
</script>

<ul>
  <li>
    {#if tree.type == "directory" && tree.entries}
      <span on:click={toggleExpansion} on:keydown={toggleExpansion}>
        {#if expanded}
          <span class="arrow">[-]</span>
        {:else}
          <span class="arrow">[+]</span>
        {/if}
        {label}
      </span>
      {#if tree.entries && expanded}
        {#each [...tree.entries] as [name, child]}
          <svelte:self {scrap} label={name} tree={child} />
        {/each}
      {/if}
    {:else}
      <span>
        <span class="no-arrow" />
        <a href="#download" title="{tree.size} bytes" on:click={download}>{label}</a>
      </span>
    {/if}
  </li>
</ul>

<style>
  ul {
    margin: 0;
    list-style: none;
    padding-left: 1.2rem;
    user-select: none;
  }

  .no-arrow {
    padding-left: 1rem;
  }

  .arrow {
    cursor: pointer;
    display: inline-block;
  }
</style>

6
scrapper_web/src/main.js Normal file
@@ -0,0 +1,6 @@
import './app.pcss'
import App from './App.svelte'

export default new App({
  target: document.getElementById('app'),
});

28
scrapper_web/src/scrapper.worker.js Normal file

@@ -0,0 +1,28 @@
import wasm, { MultiPack } from "scrapper";

async function initialize() {
  await wasm();
  let pack;
  let handlers = {
    parse(data) {
      pack = new MultiPack(data);
      return pack.tree();
    },
    download(data) {
      if (pack) {
        let { label, file_index, offset, size } = data;
        return [label, pack.download(file_index, offset, size)];
      }
    },
  };
  self.onmessage = (event) => {
    for (let [name, func] of Object.entries(handlers)) {
      let data = event.data[name];
      if (data) {
        postMessage(Object.fromEntries([[name, func(data)]]));
      }
    }
  };
}

initialize();

2
scrapper_web/src/vite-env.d.ts vendored Normal file
@@ -0,0 +1,2 @@
/// <reference types="svelte" />
/// <reference types="vite/client" />

6
scrapper_web/svelte.config.js Normal file

@@ -0,0 +1,6 @@
import { vitePreprocess } from '@sveltejs/vite-plugin-svelte'

export default {
  // Consult https://svelte.dev/docs#compile-time-svelte-preprocess
  // for more information about preprocessors
  preprocess: vitePreprocess(),
}

36
scrapper_web/tailwind.config.cjs Normal file

@@ -0,0 +1,36 @@
module.exports = {
  content: ["./src/**/*.{svelte,js,ts}"],
  plugins: [require("@tailwindcss/forms"), require("daisyui")],
  theme: {
    container: {
      center: true,
    },
  },
  daisyui: {
    styled: true,
    base: true,
    utils: true,
    logs: true,
    rtl: false,
    prefix: "",
    darkTheme: "scraptool",
    themes: [
      {
        scraptool: {
          primary: "#F28C18",
          secondary: "#b45309",
          accent: "#22d3ee",
          neutral: "#1B1D1D",
          "base-100": "#212121",
          info: "#2463EB",
          success: "#16A249",
          warning: "#DB7706",
          error: "#DC2828",
          // "--rounded-box": "0.4rem",
          // "--rounded-btn": "0.2rem"
        },
      },
    ],
  },
};

10
scrapper_web/vite.config.js Normal file

@@ -0,0 +1,10 @@
import { defineConfig } from 'vite'
import { svelte } from '@sveltejs/vite-plugin-svelte'
import wasmPack from 'vite-plugin-wasm-pack';
import preprocess from 'svelte-preprocess';

export default defineConfig({
  plugins: [
    wasmPack("./scrapper/"),
    svelte({
      preprocess: preprocess({ postcss: true }),
    }),
  ],
});

@@ -0,0 +1,125 @@
import time
from contextlib import contextmanager

try:
    import ghidra_bridge
    has_bridge = True
except ImportError:
    has_bridge = False

if has_bridge:
    b = ghidra_bridge.GhidraBridge(namespace=globals(), hook_import=True)

    @contextmanager
    def transaction():
        start()
        try:
            yield
        except Exception as e:
            end(False)
            raise e
        end(True)
else:
    @contextmanager
    def transaction():
        yield

import ghidra.program.model.symbol.SymbolType as SymbolType
import ghidra.program.model.symbol.SourceType as SourceType
from ghidra.app.cmd.label import CreateNamespacesCmd
from ghidra.program.model.data.DataUtilities import createData
from ghidra.program.model.data.DataUtilities import ClearDataMode
from ghidra.program.model.listing.CodeUnit import PLATE_COMMENT

listing = currentProgram.getListing()
dtm = currentProgram.getDataTypeManager()
py_mod = dtm.getDataType("/PyModuleDef")
py_meth = dtm.getDataType("/PyMethodDef")
NULL = toAddr(0)

def make_namespace(parts):
    ns_cmd = CreateNamespacesCmd("::".join(parts), SourceType.USER_DEFINED)
    ns_cmd.applyTo(currentProgram)
    return ns_cmd.getNamespace()

def create_data(addr, dtype):
    return createData(currentProgram, addr, dtype, 0, False, ClearDataMode.CLEAR_ALL_CONFLICT_DATA)

def create_str(addr):
    if addr.equals(NULL):
        return None
    str_len = (findBytes(addr, b"\0").offset - addr.offset) + 1
    clearListing(addr, addr.add(str_len))
    return createAsciiString(addr)

def get_call_obj(addr):
    func = getFunctionContaining(addr)
    if func is None:
        disassemble(addr)
        func = createFunction(addr, None)
    call_obj = {"this": None, "stack": []}
    for inst in currentProgram.listing.getInstructions(func.body, True):
        affected_objs = [r.toString() for r in inst.resultObjects.tolist()]
        inst_name = inst.getMnemonicString()
        if inst_name == "PUSH":
            val = inst.getScalar(0)
            if val is not None:
                call_obj["stack"].insert(0, toAddr(val.getValue()).toString())
        elif inst_name == "MOV" and "ECX" in affected_objs:
            this = inst.getScalar(1)
            if this is not None:
                call_obj["this"] = toAddr(this.getValue()).toString()
        elif inst_name == "CALL":
            break
    func = func.symbol.address
    return func, call_obj

def data_to_dict(data):
    ret = {}
    for idx in range(data.dataType.getNumComponents()):
        name = data.dataType.getComponent(idx).getFieldName()
        value = data.getComponent(idx).getValue()
        ret[name] = value
    return ret

def try_create_str(addr):
    ret = create_str(addr)
    if ret:
        return ret.getValue()

with transaction():
    PyInitModule = getSymbolAt(toAddr("006f31c0"))
    for ref in getReferencesTo(PyInitModule.address).tolist():
        func, args = get_call_obj(ref.fromAddress)
        print(func, args)
        module_name = create_str(toAddr(args['stack'][0])).getValue()
        methods = toAddr(args['stack'][1])
        module_doc = create_str(toAddr(args['stack'][2]))
        if module_doc:
            module_doc = module_doc.getValue()
        print(methods, module_name, module_doc)
        mod_ns = make_namespace(["Python", module_name])
        createLabel(func, "__init__", mod_ns, True, SourceType.USER_DEFINED)
        if module_doc:
            listing.getCodeUnitAt(func).setComment(PLATE_COMMENT, module_doc)
        while True:
            mod_data = data_to_dict(create_data(methods, py_meth))
            if mod_data['name'] is None:
                clearListing(methods, methods.add(16))
                break
            mod_data['name'] = try_create_str(mod_data['name'])
            try:
                mod_data['doc'] = try_create_str(mod_data['doc'])
            except Exception:
                mod_data['doc'] = None
            print(mod_data)
            createLabel(mod_data['ml_method'], mod_data['name'], mod_ns, True, SourceType.USER_DEFINED)
            if mod_data['doc']:
                # comment the method with its own docstring, not the module's
                listing.getCodeUnitAt(mod_data['ml_method']).setComment(PLATE_COMMENT, mod_data['doc'])
            methods = methods.add(16)
            try:
                if getBytes(methods, 4).tolist() == [0, 0, 0, 0]:
                    break
            except Exception:
                break
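The 16-byte stride and the `[0,0,0,0]` terminator check in the loop above correspond to scanning a 32-bit `PyMethodDef` array (four u32 fields: name pointer, function pointer, flags, doc pointer) until the null sentinel entry. A sketch of the same walk over raw bytes, independent of Ghidra — the field layout is assumed from the script's constants:

```python
import struct

ENTRY = struct.Struct("<4I")  # ml_name, ml_meth, ml_flags, ml_doc as 32-bit values

def walk_method_defs(buf, start=0):
    """Collect (name_ptr, func_ptr, flags, doc_ptr) until the null terminator entry."""
    off = start
    entries = []
    while off + ENTRY.size <= len(buf):
        name_ptr, func_ptr, flags, doc_ptr = ENTRY.unpack_from(buf, off)
        if name_ptr == 0:  # sentinel: end of the PyMethodDef table
            break
        entries.append((name_ptr, func_ptr, flags, doc_ptr))
        off += ENTRY.size  # same 16-byte stride as methods.add(16)
    return entries

table = ENTRY.pack(0x4000, 0x5000, 1, 0) + ENTRY.pack(0, 0, 0, 0)
print(walk_method_defs(table))  # [(16384, 20480, 1, 0)]
```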

177
tools/remaster/scrap_parse/.gitignore vendored Normal file
@@ -0,0 +1,177 @@
# Generated by Cargo
# will have compiled files and executables
debug/
target/
# Remove Cargo.lock from gitignore if creating an executable, leave it for libraries
# More information here https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html
Cargo.lock
# These are backup files generated by rustfmt
**/*.rs.bk
# MSVC Windows builds of rustc generate these, which store debugging information
*.pdb
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/#use-with-ide
.pdm.toml
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
*.pkl.gz

File diff suppressed because it is too large

tools/remaster/scrap_parse/Cargo.toml
@@ -9,13 +9,18 @@ edition = "2021"
 anyhow = "1.0.69"
 binrw = "0.11.1"
 chrono = { version = "0.4.23", features = ["serde"] }
-chrono-humanize = "0.2.2"
+# chrono-humanize = "0.2.2"
 clap = { version = "4.1.6", features = ["derive"] }
 configparser = { version = "3.0.2", features = ["indexmap"] }
 flate2 = "1.0.25"
 fs-err = "2.9.0"
 indexmap = { version = "1.9.2", features = ["serde"] }
+# memmap2 = "0.5.10"
 modular-bitfield = "0.11.2"
 rhexdump = "0.1.1"
 serde = { version = "1.0.152", features = ["derive"] }
-serde_json = { version = "1.0.93", features = ["unbounded_depth"] }
+serde-pickle = "1.1.1"
+serde_json = { version = "1.0.95", features = ["preserve_order", "unbounded_depth"] }
+steamlocate = "1.1.0"
+walkdir = "2.3.3"
+obj = "0.10.2"

@@ -0,0 +1,23 @@
import pickle
import subprocess as SP

from . import packed_browser
from . import level_import

def scrap_bridge(*cmd):
    cmd = ["scrap_parse", *cmd]
    # capture stderr so the error message below actually has something to report
    proc = SP.Popen(cmd, stderr=SP.PIPE, stdin=None, stdout=SP.PIPE, shell=True, text=False)
    stdout, stderr = proc.communicate()
    code = proc.wait()
    if code:
        raise RuntimeError(str(stderr, "utf8"))
    return pickle.loads(stdout)

def register():
    packed_browser.register()
    level_import.register()

def unregister():
    packed_browser.unregister()
    level_import.unregister()
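`scrap_bridge()` shuttles structured data from the Rust CLI into Blender's Python by pickling over stdout. The round-trip can be exercised without `scrap_parse` itself — here a Python child process stands in for the binary, and the payload is made up for illustration:

```python
import pickle
import subprocess as SP
import sys

# Stand-in for the `scrap_parse` binary: any child process that writes a
# pickle to stdout works with the scrap_bridge() scheme above.
child = [sys.executable, "-c",
         "import pickle, sys; sys.stdout.buffer.write(pickle.dumps({'root': '/tmp'}))"]
proc = SP.Popen(child, stdout=SP.PIPE, stderr=SP.PIPE)
stdout, stderr = proc.communicate()
if proc.wait():
    raise RuntimeError(stderr.decode("utf8"))
data = pickle.loads(stdout)
print(data)  # {'root': '/tmp'}
```

Pickle (rather than JSON) keeps bytes and nested structures intact with no escaping, which is why the level importer below switches from `json.load` to `pickle.load`.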

@ -2,16 +2,16 @@ import bpy
import sys import sys
import os import os
import re import re
import json
import gzip import gzip
import pickle
import argparse import argparse
import shutil
from glob import glob from glob import glob
from mathutils import Vector from mathutils import Vector
from pathlib import Path from pathlib import Path
import numpy as np import numpy as np
import itertools as ITT import itertools as ITT
from pprint import pprint from pprint import pprint
# from .. import scrap_bridge
import bmesh import bmesh
from bpy.props import StringProperty, BoolProperty from bpy.props import StringProperty, BoolProperty
from bpy_extras.io_utils import ImportHelper from bpy_extras.io_utils import ImportHelper
@ -25,12 +25,6 @@ if "--" in sys.argv:
parser.add_argument("file_list", nargs="+") parser.add_argument("file_list", nargs="+")
cmdline = parser.parse_args(args) cmdline = parser.parse_args(args)
def fix_pos(xyz):
x, y, z = xyz
return x, z, y
class ScrapImporter(object): class ScrapImporter(object):
def __init__(self, options): def __init__(self, options):
self.unhandled = set() self.unhandled = set()
@ -39,16 +33,22 @@ class ScrapImporter(object):
self.model_scale = 1000.0 self.model_scale = 1000.0
self.spawn_pos = {} self.spawn_pos = {}
self.objects = {} self.objects = {}
print("Loading", filepath) # print("Loading", filepath)
with gzip.open(filepath, "r") as fh: # scrapland_path=scrap_bridge("find-scrapland")
data = json.load(fh) # print(scrapland_path)
# packed_data=scrap_bridge("parse-packed",scrapland_path)
# print(packed_data)
# get_output(["scrap_parse","parse-file","--stdout",scrapland_path,"levels/temple"])
# raise NotImplementedError()
with gzip.open(filepath, "rb") as fh:
data = pickle.load(fh)
self.path = data.pop("path") self.path = data.pop("path")
self.root = data.pop("root") self.root = data.pop("root")
self.config = data.pop("config") self.config = data.pop("config")
self.dummies = data.pop("dummies")["DUM"]["dummies"] self.dummies = data.pop("dummies")["dummies"]
self.moredummies = data.pop("moredummies") self.moredummies = data.pop("moredummies")
self.emi = data.pop("emi")["EMI"] self.emi = data.pop("emi")
self.sm3 = data.pop("sm3")["SM3"] self.sm3 = data.pop("sm3")
def make_empty(self, name, pos, rot=None): def make_empty(self, name, pos, rot=None):
empty = bpy.data.objects.new(name, None) empty = bpy.data.objects.new(name, None)
@ -119,7 +119,7 @@ class ScrapImporter(object):
bpy.context.scene.collection.objects.link(light) bpy.context.scene.collection.objects.link(light)
def create_nodes(self): def create_nodes(self):
for node in self.sm3["scene"]["nodes"]: for node in self.sm3["scene"].get("nodes",[]):
node_name = node["name"] node_name = node["name"]
node = node.get("content", {}) node = node.get("content", {})
if not node: if not node:
@ -212,6 +212,8 @@ class ScrapImporter(object):
) )
else: else:
folders = ITT.chain([start_folder], start_folder.parents) folders = ITT.chain([start_folder], start_folder.parents)
folders=list(folders)
print(f"Looking for {path} in {folders}")
for folder in folders: for folder in folders:
for suffix in file_extensions: for suffix in file_extensions:
for dds in [".", "dds"]: for dds in [".", "dds"]:
@ -227,7 +229,7 @@ class ScrapImporter(object):
return list(filter(lambda i: (i.type, i.name) == (dtype, name), node.inputs)) return list(filter(lambda i: (i.type, i.name) == (dtype, name), node.inputs))
def build_material(self, mat_key, mat_def): def build_material(self, mat_key, mat_def, map_def):
mat_props = dict(m.groups() for m in re.finditer(r"\(\+(\w+)(?::(\w*))?\)",mat_key)) mat_props = dict(m.groups() for m in re.finditer(r"\(\+(\w+)(?::(\w*))?\)",mat_key))
for k,v in mat_props.items(): for k,v in mat_props.items():
mat_props[k]=v or True mat_props[k]=v or True
@ -260,13 +262,13 @@ class ScrapImporter(object):
"Roughness": 0.0, "Roughness": 0.0,
"Specular": 0.2, "Specular": 0.2,
} }
tex_slots=[ tex_slot_map={
"Base Color", "base": "Base Color",
"Metallic", "metallic":"Metallic",
None, # "Clearcoat" ? env map? "unk_1":None, # "Clearcoat" ? env map?
"Normal", "bump":"Normal",
"Emission" "glow":"Emission"
] }
mat = bpy.data.materials.new(mat_key) mat = bpy.data.materials.new(mat_key)
mat.use_nodes = True mat.use_nodes = True
@ -275,7 +277,13 @@ class ScrapImporter(object):
imgs = {} imgs = {}
animated_textures={} animated_textures={}
is_transparent = True is_transparent = True
for slot,tex in zip(tex_slots,mat_def["maps"]): print(map_def)
if map_def[0]:
print("=== MAP[0]:",self.resolve_path(map_def[0]))
if map_def[2]:
print("=== MAP[2]:",self.resolve_path(map_def[2]))
for slot,tex in mat_def["maps"].items():
slot=tex_slot_map.get(slot)
if (slot is None) and tex: if (slot is None) and tex:
self.unhandled.add(tex["texture"]) self.unhandled.add(tex["texture"])
print(f"Don't know what to do with {tex}") print(f"Don't know what to do with {tex}")
@@ -286,9 +294,7 @@ class ScrapImporter(object):
                 continue
             tex_name = os.path.basename(tex_file)
             if ".000." in tex_name:
-                tex_files=glob(tex_file.replace(".000.",".*."))
-                num_frames=len(tex_files)
-                animated_textures[slot]=num_frames
+                animated_textures[slot]=len(glob(tex_file.replace(".000.",".*.")))
             mat_props.update(overrides.get(tex_name,{}))
             if any(
                 tex_name.find(fragment) != -1
@@ -297,7 +303,7 @@ class ScrapImporter(object):
                 continue
             else:
                 is_transparent = False
-            imgs[slot]=bpy.data.images.load(tex_file)
+            imgs[slot]=bpy.data.images.load(tex_file,check_existing=True)
         for n in nodes:
             nodes.remove(n)
         out = nodes.new("ShaderNodeOutputMaterial")
@@ -311,7 +317,6 @@ class ScrapImporter(object):
             settings.update(glass_settings)
         for name, value in settings.items():
             shader.inputs[name].default_value = value
-        sockets_used = set()
         for socket,img in imgs.items():
             img_node = nodes.new("ShaderNodeTexImage")
             img_node.name = img.name
@@ -369,17 +374,20 @@ class ScrapImporter(object):
             node_tree.links.new(imgs["Base Color"].outputs["Color"],transp_shader.inputs["Color"])
             shader_out=mix_shader.outputs["Shader"]
         node_tree.links.new(shader_out, out.inputs["Surface"])
+        # try:
+        #     bpy.ops.node.button()
+        # except:
+        #     pass
         return mat

     def apply_maps(self, ob, m_mat, m_map):
         mat_key, m_mat = m_mat
-        map_key, m_map = m_map # TODO?: MAP
+        map_key, m_map = m_map
         if mat_key == 0:
             return
         mat_name = m_mat.get("name", f"MAT:{mat_key:08X}")
+        map_name = f"MAP:{map_key:08X}"
         if mat_name not in bpy.data.materials:
-            ob.active_material = self.build_material(mat_name, m_mat)
+            ob.active_material = self.build_material(mat_name, m_mat, m_map)
         else:
             ob.active_material = bpy.data.materials[mat_name]
@@ -424,17 +432,17 @@ class ScrapImporter(object):
         ob = bpy.data.objects.new(name, me)
         self.apply_maps(ob, m_mat, m_map)
         bpy.context.scene.collection.objects.link(ob)
-        self.objects.setdefault(name, []).append(ob)
+        self.objects.setdefault(name.split("(")[0], []).append(ob)
         return ob

 class Scrap_Load(Operator, ImportHelper):
-    bl_idname = "scrap_utils.import_json"
-    bl_label = "Import JSON"
-    filename_ext = ".json.gz"
-    filter_glob: StringProperty(default="*.json.gz", options={"HIDDEN"})
+    bl_idname = "scrap_utils.import_pickle"
+    bl_label = "Import Pickle"
+    filename_ext = ".pkl.gz"
+    filter_glob: StringProperty(default="*.pkl.gz", options={"HIDDEN"})

     create_dummies: BoolProperty(
         name="Import dummies",
@@ -461,11 +469,6 @@ class Scrap_Load(Operator, ImportHelper):
         default=True
     )

-    # remove_dup_verts: BoolProperty(
-    #     name="Remove overlapping vertices for smoother meshes",
-    #     default=False
-    # )

     def execute(self, context):
         bpy.ops.preferences.addon_enable(module = "node_arrange")
@@ -488,7 +491,7 @@ def unregister():

 if __name__ == "__main__":
     if cmdline is None or not cmdline.file_list:
         register()
-        bpy.ops.scrap_utils.import_json("INVOKE_DEFAULT")
+        bpy.ops.scrap_utils.import_pickle("INVOKE_DEFAULT")
     else:
         for file in cmdline.file_list:
             bpy.context.preferences.view.show_splash = False
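Since the importer now consumes `.pkl.gz` dumps (the Rust parser writes gzip-compressed serde_pickle output, which Python's `pickle` can read), inspecting a dump outside Blender can be sketched as follows. The `load_dump` helper name is ours, not part of the addon:

```python
import gzip
import pickle


def load_dump(path):
    """Load a gzip-compressed pickle dump as written by the updated parser."""
    with gzip.open(path, "rb") as fh:
        return pickle.load(fh)
```

Useful for poking at the `config`/`emi`/`sm3`/`dummies` keys of a converted level without launching Blender.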
@@ -0,0 +1,118 @@
import sys

import bpy
from bpy.props import (StringProperty, BoolProperty, CollectionProperty,
                       IntProperty)

from .. import scrap_bridge

bl_info = {
    "name": "Packed Archive File",
    "blender": (2, 71, 0),
    "location": "File > Import",
    "description": "Import data from Scrapland .packed Archive",
    "category": "Import-Export"}

class ImportFilearchives(bpy.types.Operator):
    """Import whole filearchives directory."""
    bl_idname = "import_scene.packed"
    bl_label = 'Import Scrapland .packed'

    directory = StringProperty(name="'Scrapland' folder",
                               subtype="DIR_PATH", options={'HIDDEN'})
    filter_folder = BoolProperty(default=True, options={'HIDDEN'})
    filter_glob = StringProperty(default="", options={'HIDDEN'})

    def invoke(self, context, event):
        context.window_manager.fileselect_add(self)
        return {'RUNNING_MODAL'}

    def execute(self, context):
        # TODO: Validate filepath
        bpy.ops.ui.packed_browser('INVOKE_DEFAULT', filepath=self.directory)
        return {'FINISHED'}

class PackedFile(bpy.types.PropertyGroup):
    path = bpy.props.StringProperty()
    packed_file = bpy.props.StringProperty()
    selected = bpy.props.BoolProperty(name="")
    offset = bpy.props.IntProperty()
    size = bpy.props.IntProperty()
    archive = None

class PackedBrowser(bpy.types.Operator):
    bl_idname = "ui.packed_browser"
    bl_label = "Packed Browser"
    bl_options = {'INTERNAL'}

    files = CollectionProperty(type=PackedFile)
    selected_index = IntProperty(default=0)

    def invoke(self, context, event):
        scrapland_path = scrap_bridge("find-scrapland")
        print(scrapland_path)
        packed_data = scrap_bridge("parse-packed", scrapland_path)
        print(packed_data)
        self.packed_data = packed_data
        return context.window_manager.invoke_props_dialog(self)

    def draw(self, context):
        if self.selected_index != -1:
            print("new selected_index: " + str(self.selected_index))
            self.files.clear()
            for packed_name, files in self.archive:
                for file in files:
                    entry = self.files.add()
                    entry.packed_file = packed_name
                    [entry.path, entry.offset, entry.size] = file
            self.selected_index = -1
        self.layout.template_list("PackedDirList", "", self, "files", self, "selected_index")

    def execute(self, context):
        print("execute")
        return {'FINISHED'}

class PackedDirList(bpy.types.UIList):
    def draw_item(self, context, layout, data, item, icon, active_data, active_propname):
        operator = data
        packed_entry = item
        if self.layout_type in {'DEFAULT', 'COMPACT'}:
            layout.prop(packed_entry, "name", text="", emboss=False, icon_value=icon)
            layout.prop(packed_entry, "selected")
        elif self.layout_type in {'GRID'}:
            layout.alignment = 'CENTER'
            layout.label(text="", icon_value=icon)

def menu_func_import(self, context):
    self.layout.operator(ImportFilearchives.bl_idname, text="Scrapland .packed")

classes = [
    PackedFile,
    PackedDirList,
    PackedBrowser,
    ImportFilearchives,
]

def register():
    for cls in classes:
        bpy.utils.register_class(cls)
    bpy.types.INFO_MT_file_import.append(menu_func_import)

def unregister():
    for cls in reversed(classes):
        bpy.utils.unregister_class(cls)
    bpy.types.INFO_MT_file_import.remove(menu_func_import)

if __name__ == "__main__":
    import imp
    imp.reload(sys.modules[__name__])
    for cls in classes:
        bpy.utils.register_class(cls)
@@ -0,0 +1,66 @@
int _D3DXGetFVFVertexSize(uint fvf)
{
    uint uVar1;
    uint uVar2;
    uint uVar3;
    int vert_size;

    uVar1 = fvf & 0xe;
    vert_size = 0;
    if (uVar1 == 2) {
        vert_size = 0xc;
    }
    else if ((uVar1 == 4) || (uVar1 == 6)) {
        vert_size = 0x10;
    }
    else if (uVar1 == 8) {
        vert_size = 0x14;
    }
    else if (uVar1 == 0xa) {
        vert_size = 0x18;
    }
    else if (uVar1 == 0xc) {
        vert_size = 0x1c;
    }
    else if (uVar1 == 0xe) {
        vert_size = 0x20;
    }
    if ((fvf & 0x10) != 0) {
        vert_size += 0xc;
    }
    if ((fvf & 0x20) != 0) {
        vert_size += 4;
    }
    if ((fvf & 0x40) != 0) {
        vert_size += 4;
    }
    if (fvf < '\0') {
        vert_size += 4;
    }
    uVar1 = fvf >> 8 & 0xf;
    uVar3 = fvf >> 16;
    if (uVar3 == 0) {
        vert_size += uVar1 * 8;
    }
    else {
        for (; uVar1 != 0; uVar1 -= 1) {
            uVar2 = uVar3 & 3;
            if (uVar2 == 0) {
                vert_size += 8;
            }
            else if (uVar2 == 1) {
                vert_size += 0xc;
            }
            else if (uVar2 == 2) {
                vert_size += 0x10;
            }
            else if (uVar2 == 3) {
                vert_size += 4;
            }
            uVar3 >>= 2;
        }
    }
    return vert_size;
}
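For experimenting outside Ghidra, the same vertex-size table can be expressed in Python. This is a sketch of the decompiled logic above; we read the odd `fvf < '\0'` branch as a sign-bit test on the low byte, i.e. the D3DFVF specular bit 0x80:

```python
def fvf_vertex_size(fvf: int) -> int:
    """Python port of the decompiled _D3DXGetFVFVertexSize."""
    # Position format: bits 1-3 select XYZ / XYZRHW / XYZB1..B5.
    size = {0x2: 0xC, 0x4: 0x10, 0x6: 0x10, 0x8: 0x14,
            0xA: 0x18, 0xC: 0x1C, 0xE: 0x20}.get(fvf & 0xE, 0)
    if fvf & 0x10:  # normal: 3 floats
        size += 0xC
    if fvf & 0x20:  # point size: 1 float
        size += 4
    if fvf & 0x40:  # diffuse color: 1 dword
        size += 4
    if fvf & 0x80:  # specular color (the `fvf < '\0'` branch)
        size += 4
    num_tex = (fvf >> 8) & 0xF
    fmt_bits = fvf >> 16
    if fmt_bits == 0:
        # No per-stage format bits: every texcoord set is 2 floats.
        size += num_tex * 8
    else:
        # Two format bits per stage: 0=2D, 1=3D, 2=4D, 3=1D.
        for _ in range(num_tex):
            size += {0: 8, 1: 0xC, 2: 0x10, 3: 4}[fmt_bits & 3]
            fmt_bits >>= 2
    return size
```

For example, `XYZ | NORMAL | TEX1` (0x112) gives 12 + 12 + 8 = 32 bytes per vertex, matching the stride the game's mesh data uses.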
@@ -1,103 +0,0 @@
bl_info = {
    "name": "Riot Archive File (RAF)",
    "blender": (2, 71, 0),
    "location": "File > Import",
    "description": "Import LoL data of an Riot Archive File",
    "category": "Import-Export"}

import bpy
from io_scene_lolraf import raf_utils
from bpy.props import (StringProperty, BoolProperty, CollectionProperty,
                       IntProperty)

class ImportFilearchives(bpy.types.Operator):
    """Import whole filearchives directory."""
    bl_idname = "import_scene.rafs"
    bl_label = 'Import LoL filearchives'

    directory = StringProperty(name="'filearchives' folder",
                               subtype="DIR_PATH", options={'HIDDEN'})
    filter_folder = BoolProperty(default=True, options={'HIDDEN'})
    filter_glob = StringProperty(default="", options={'HIDDEN'})

    def invoke(self, context, event):
        context.window_manager.fileselect_add(self)
        return {'RUNNING_MODAL'}

    def execute(self, context):
        # TODO: Validate filepath
        bpy.ops.ui.raf_browser('INVOKE_DEFAULT',filepath=self.directory)
        return {'FINISHED'}

class RAFEntry(bpy.types.PropertyGroup):
    name = bpy.props.StringProperty()
    selected = bpy.props.BoolProperty(name="")
    archive = None

class RAFBrowser(bpy.types.Operator):
    bl_idname = "ui.raf_browser"
    bl_label = "RAF-browser"
    bl_options = {'INTERNAL'}

    filepath = StringProperty()
    current_dir = CollectionProperty(type=RAFEntry)
    selected_index = IntProperty(default=0)

    def invoke(self, context, event):
        global archive
        archive = raf_utils.RAFArchive(self.filepath)
        return context.window_manager.invoke_props_dialog(self)

    def draw(self, context):
        if self.selected_index != -1:
            print("new selected_index: " + str(self.selected_index))
            global archive
            # TODO: change current directory of archive
            self.current_dir.clear()
            for dir in archive.current_dir():
                entry = self.current_dir.add()
                entry.name = dir
            self.selected_index = -1
        self.layout.template_list("RAFDirList", "", self, "current_dir", self, "selected_index")

    def execute(self, context):
        print("execute")
        return {'FINISHED'}

class RAFDirList(bpy.types.UIList):
    def draw_item(self, context, layout, data, item, icon, active_data, active_propname):
        operator = data
        raf_entry = item
        if self.layout_type in {'DEFAULT', 'COMPACT'}:
            layout.prop(raf_entry, "name", text="", emboss=False, icon_value=icon)
            layout.prop(raf_entry, "selected")
        elif self.layout_type in {'GRID'}:
            layout.alignment = 'CENTER'
            layout.label(text="", icon_value=icon)

def menu_func_import(self, context):
    self.layout.operator(ImportFilearchives.bl_idname, text="LoL Filearchives")

def register():
    bpy.utils.register_module(__name__)
    bpy.types.INFO_MT_file_import.append(menu_func_import)

def unregister():
    bpy.utils.unregister_module(__name__)
    bpy.types.INFO_MT_file_import.remove(menu_func_import)

if __name__ == "__main__":
    import imp
    imp.reload(raf_utils)
    bpy.utils.register_module(__name__)
@@ -0,0 +1,15 @@
use std::path::PathBuf;

use steamlocate::SteamDir;

use anyhow::{bail, Result};

const APP_ID: u32 = 897610;

pub(crate) fn get_executable() -> Result<PathBuf> {
    let Some(mut steam) = SteamDir::locate() else {
        bail!("Failed to find steam folder");
    };
    let Some(app) = steam.app(&APP_ID) else {
        bail!("App {APP_ID} is not installed!");
    };
    Ok(app.path.clone())
}
@@ -5,6 +5,7 @@ use binrw::prelude::*;
 use binrw::until_exclusive;
 use chrono::{DateTime, NaiveDateTime, Utc};
 use clap::Parser;
+use clap::Subcommand;
 use configparser::ini::Ini;
 use flate2::write::GzEncoder;
 use flate2::Compression;
@@ -15,14 +16,37 @@ use modular_bitfield::specifiers::B2;
 use modular_bitfield::specifiers::B4;
 use modular_bitfield::BitfieldSpecifier;
 use serde::Serialize;
-use serde_json::Map;
-use serde_json::Value;
 use std::collections::HashMap;
 use std::fmt::Debug;
 use std::fs::File;
-use std::io::{BufReader, Read, Seek};
+use std::io::{BufReader, Cursor, Read, Seek};
 use std::path::Path;
 use std::path::PathBuf;
+use walkdir::WalkDir;
+
+mod find_scrap;
+
+type IniData = IndexMap<String, IndexMap<String, Option<String>>>;
+
+#[binread]
+#[derive(Serialize, Debug)]
+struct PackedFile {
+    path: PascalString,
+    size: u32,
+    offset: u32,
+}
+
+#[binread]
+#[br(magic = b"BFPK")]
+#[derive(Serialize, Debug)]
+struct PackedHeader {
+    #[br(temp, assert(version == 0))]
+    version: u32,
+    #[br(temp)]
+    num_files: u32,
+    #[br(count = num_files)]
+    files: Vec<PackedFile>,
+}

 #[binread]
 #[derive(Serialize, Debug)]
@@ -141,6 +165,7 @@ struct IniSection {
 #[br(magic = b"INI\0")]
 #[derive(Debug)]
 struct INI {
+    #[br(temp)]
     size: u32,
     #[br(temp)]
     num_sections: u32,
@@ -153,13 +178,17 @@ impl Serialize for INI {
     where
         S: serde::Serializer,
     {
+        use serde::ser::Error;
         let blocks: Vec<String> = self
             .sections
             .iter()
             .flat_map(|s| s.sections.iter())
             .map(|s| s.string.clone())
             .collect();
-        Ini::new().read(blocks.join("\n")).serialize(serializer)
+        Ini::new()
+            .read(blocks.join("\n"))
+            .map_err(Error::custom)?
+            .serialize(serializer)
     }
 }
@@ -227,7 +256,7 @@ enum Pos {
 #[repr(u32)]
 #[derive(Debug, Serialize, Copy, Clone, PartialEq, Eq, PartialOrd, Ord)]
 pub struct FVF {
-    reserved_1: bool,
+    reserved: bool,
     pos: Pos,
     normal: bool,
     point_size: bool,
@@ -267,17 +296,17 @@ impl FVF {
         }
     }

-    fn num_w(&self) -> usize {
-        use Pos::*;
-        match self.pos() {
-            XYZ | XYZRHW => 0,
-            XYZB1 => 1,
-            XYZB2 => 2,
-            XYZB3 => 3,
-            XYZB4 => 4,
-            XYZB5 => 5,
-        }
-    }
+    // fn num_w(&self) -> usize {
+    //     use Pos::*;
+    //     match self.pos() {
+    //         XYZ | XYZRHW => 0,
+    //         XYZB1 => 1,
+    //         XYZB2 => 2,
+    //         XYZB3 => 3,
+    //         XYZB4 => 4,
+    //         XYZB5 => 5,
+    //     }
+    // }
 }

 fn vertex_size_from_id(fmt_id: u32) -> Result<u32> {
@@ -361,6 +390,7 @@ struct MD3D {
     tris: Vec<[u16; 3]>,
     mesh_data: LFVF,
     unk_table_1: RawTable<2>,
+    rest: Unparsed<0x100>,
     // TODO:
     // ==
     // unk_t1_count: u32,
@@ -383,7 +413,7 @@ enum NodeData {
     #[br(magic = 0x0u32)]
     Null,
     #[br(magic = 0xa1_00_00_01_u32)]
-    TriangleMesh, // Empty?
+    TriangleMesh(Unparsed<0x10>), // TODO: Empty or unused?
     #[br(magic = 0xa1_00_00_02_u32)]
     Mesh(MD3D),
     #[br(magic = 0xa2_00_00_04_u32)]
@@ -393,7 +423,7 @@ enum NodeData {
     #[br(magic = 0xa4_00_00_10_u32)]
     Ground(SUEL),
     #[br(magic = 0xa5_00_00_20_u32)]
-    SisPart(Unparsed<0x10>), // TODO: Particles
+    SistPart(Unparsed<0x10>), // TODO: Particles
     #[br(magic = 0xa6_00_00_40_u32)]
     Graphic3D(SPR3),
     #[br(magic = 0xa6_00_00_80_u32)]
@@ -521,6 +551,16 @@ struct MAP {
     unk_3: Option<[u8; 0xc]>,
 }

+#[binread]
+#[derive(Debug, Serialize)]
+struct Textures {
+    base: Optional<MAP>,
+    metallic: Optional<MAP>,
+    unk_1: Optional<MAP>,
+    bump: Optional<MAP>,
+    glow: Optional<MAP>,
+}
+
 #[binread]
 #[br(magic = b"MAT\0")]
 #[derive(Debug, Serialize)]
@@ -532,7 +572,7 @@ struct MAT {
     name: Option<PascalString>,
     unk_f: [RGBA; 7],
     unk_data: [RGBA; 0x18 / 4],
-    maps: [Optional<MAP>; 5], // Base Color, Metallic?, ???, Normal, Emission
+    maps: Textures,
 }

 #[binread]
@@ -556,9 +596,9 @@ struct SCN {
     #[br(temp,assert(unk_3==1))]
     unk_3: u32,
     num_nodes: u32,
-    #[br(count = num_nodes)] // 32
+    #[br(count = 1)] // 32
     nodes: Vec<Node>,
-    ani: Optional<ANI>, // TODO:?
+    // ani: Optional<ANI>, // TODO: ?
 }

 fn convert_timestamp(dt: u32) -> Result<DateTime<Utc>> {
@@ -682,11 +722,11 @@ struct CM3 {
 #[binread]
 #[derive(Debug, Serialize)]
 struct Dummy {
-    has_next: u32,
     name: PascalString,
     pos: [f32; 3],
     rot: [f32; 3],
     info: Optional<INI>,
+    has_next: u32,
 }

 #[binread]
@@ -697,7 +737,6 @@ struct DUM {
     #[br(assert(version==1, "Invalid DUM version"))]
     version: u32,
     num_dummies: u32,
-    unk_1: u32,
     #[br(count=num_dummies)]
     dummies: Vec<Dummy>,
 }
@@ -826,13 +865,6 @@ enum Data {
     EMI(EMI),
 }

-#[derive(Parser, Debug)]
-#[command(author, version, about, long_about = None)]
-struct Args {
-    root: PathBuf,
-    path: PathBuf,
-}

 fn parse_file(path: &PathBuf) -> Result<Data> {
     let mut rest_size = 0;
     let mut fh = BufReader::new(fs::File::open(path)?);
@@ -842,11 +874,11 @@ fn parse_file(path: &PathBuf) -> Result<Data> {
         .unwrap_or(0)
         .try_into()
         .unwrap_or(u32::MAX);
-    println!("Read {} bytes from {}", pos, path.display());
+    eprintln!("Read {} bytes from {}", pos, path.display());
     let mut buffer = [0u8; 0x1000];
     if let Ok(n) = fh.read(&mut buffer) {
         if n != 0 {
-            println!("Rest:\n{}", rhexdump::hexdump_offset(&buffer[..n], pos));
+            eprintln!("Rest:\n{}", rhexdump::hexdump_offset(&buffer[..n], pos));
         }
     };
@@ -855,52 +887,182 @@ fn parse_file(path: &PathBuf) -> Result<Data> {
         }
         rest_size += n;
     }
-    println!("+{rest_size} unparsed bytes");
+    eprintln!("+{rest_size} unparsed bytes");
     Ok(ret)
 }

-fn load_ini(path: &PathBuf) -> IndexMap<String, IndexMap<String, Option<String>>> {
+fn load_ini(path: &PathBuf) -> IniData {
     Ini::new().load(path).unwrap_or_default()
 }
-fn load_data(root: &Path, path: &Path) -> Result<Value> {
-    let full_path = &root.join(path);
-    let emi_path = full_path.join("map").join("map3d.emi");
-    let sm3_path = emi_path.with_extension("sm3");
-    let dum_path = emi_path.with_extension("dum");
-    let config_file = emi_path.with_extension("ini");
-    let moredummies = emi_path.with_file_name("moredummies").with_extension("ini");
-    let mut data = serde_json::to_value(HashMap::<(), ()>::default())?;
-    data["config"] = serde_json::to_value(load_ini(&config_file))?;
-    data["moredummies"] = serde_json::to_value(load_ini(&moredummies))?;
-    data["emi"] = serde_json::to_value(parse_file(&emi_path)?)?;
-    data["sm3"] = serde_json::to_value(parse_file(&sm3_path)?)?;
-    data["dummies"] = serde_json::to_value(parse_file(&dum_path)?)?;
-    data["path"] = serde_json::to_value(path)?;
-    data["root"] = serde_json::to_value(root)?;
-    Ok(data)
-}
+#[derive(Serialize, Debug)]
+struct Level {
+    config: IniData,
+    moredummies: IniData,
+    emi: EMI,
+    sm3: SM3,
+    dummies: DUM,
+    path: PathBuf,
+    root: PathBuf,
+}
+
+impl Level {
+    fn load(root: &Path, path: &Path) -> Result<Self> {
+        let full_path = &root.join(path);
+        let emi_path = full_path.join("map").join("map3d.emi");
+        let sm3_path = emi_path.with_extension("sm3");
+        let dum_path = emi_path.with_extension("dum");
+        let config_file = emi_path.with_extension("ini");
+        let moredummies = emi_path.with_file_name("moredummies").with_extension("ini");
+        let config = load_ini(&config_file);
+        let moredummies = load_ini(&moredummies);
+        let Data::EMI(emi) = parse_file(&emi_path)? else {
+            bail!("Failed to parse EMI at {emi_path}", emi_path = emi_path.display());
+        };
+        let Data::SM3(sm3) = parse_file(&sm3_path)? else {
+            bail!("Failed to parse SM3 at {sm3_path}", sm3_path = sm3_path.display());
+        };
+        let Data::DUM(dummies) = parse_file(&dum_path)? else {
+            bail!("Failed to parse DUM at {dum_path}", dum_path = dum_path.display());
+        };
+        Ok(Level {
+            config,
+            moredummies,
+            emi,
+            sm3,
+            dummies,
+            path: path.into(),
+            root: root.into(),
+        })
+    }
+}
-fn main() -> Result<()> {
-    let args = Args::try_parse()?;
-    let out_path = PathBuf::from(
-        args.path
-            .components()
-            .last()
-            .unwrap()
-            .as_os_str()
-            .to_string_lossy()
-            .into_owned(),
-    )
-    .with_extension("json.gz");
-    let full_path = &args.root.join(&args.path);
-    let data = if full_path.is_dir() {
-        load_data(&args.root, &args.path)?
-    } else {
-        serde_json::to_value(parse_file(full_path)?)?
-    };
-    let mut dumpfile = GzEncoder::new(File::create(&out_path)?, Compression::best());
-    serde_json::to_writer_pretty(&mut dumpfile, &data)?;
-    println!("Wrote {path}", path = out_path.display());
-    Ok(())
-}
+#[derive(Subcommand, Debug)]
+enum Commands {
+    FindScrapland,
+    ParsePacked {
+        scrap_path: PathBuf,
+    },
+    ParseFile {
+        #[clap(long)]
+        /// Write to stdout
+        stdout: bool,
+        /// Scrapland root path
+        root: PathBuf,
+        /// Level to parse and convert
+        level: PathBuf,
+    },
+}
+
+#[derive(Parser, Debug)]
+#[command(author, version, about, long_about = None)]
+#[command(propagate_version = true)]
+struct Args {
+    #[arg(long, short)]
+    /// Write data as JSON
+    json: bool,
+    #[command(subcommand)]
+    command: Commands,
+}
+
+fn cmd_parse_packed(root: &Path) -> Result<HashMap<PathBuf, Vec<PackedFile>>> {
+    let mut packed_map = HashMap::new();
+    for entry in WalkDir::new(root).into_iter().filter_map(|e| e.ok()) {
+        let path = entry.path();
+        if path
+            .extension()
+            .map(|e| e.to_str() == Some("packed"))
+            .unwrap_or(false)
+        {
+            let path = entry.path().to_owned();
+            let header: PackedHeader = BufReader::new(File::open(&path)?).read_le()?;
+            packed_map.insert(path, header.files);
+        }
+    }
+    Ok(packed_map)
+}
+
+fn to_bytes<T>(data: &T, json: bool) -> Result<Vec<u8>>
+where
+    T: Serialize,
+{
+    if json {
+        Ok(serde_json::to_vec_pretty(data)?)
+    } else {
+        Ok(serde_pickle::to_vec(data, Default::default())?)
+    }
+}
+
+fn cmd_parse_file(stdout: bool, root: &Path, path: &Path, json: bool) -> Result<()> {
+    let out_path = PathBuf::from(
+        path.components()
+            .last()
+            .unwrap()
+            .as_os_str()
+            .to_string_lossy()
+            .into_owned(),
+    );
+    let out_path = if json {
+        out_path.with_extension("json.gz")
+    } else {
+        out_path.with_extension("pkl.gz")
+    };
+    let full_path = &root.join(path);
+    let data = if full_path.is_dir() {
+        let level = Level::load(root, path)?;
+        to_bytes(&level, json)?
+    } else {
+        let data = parse_file(full_path)?;
+        to_bytes(&data, json)?
+    };
+    let mut data = Cursor::new(data);
+    if stdout {
+        let mut stdout = std::io::stdout().lock();
+        std::io::copy(&mut data, &mut stdout)?;
+    } else {
+        let mut fh = GzEncoder::new(File::create(&out_path)?, Compression::best());
+        std::io::copy(&mut data, &mut fh)?;
+        eprintln!("Wrote {path}", path = out_path.display());
+    };
+    Ok(())
+}
+
+fn emi_to_obj(emi: EMI) -> ! {
+    // let mut obj_data = obj::ObjData::default();
+    // for mesh in emi.tri {
+    //     for vert in mesh.data.verts_1.inner.map(|d| d.data).unwrap_or_default() {
+    //         obj_data.position.push(vert.xyz);
+    //         obj_data.normal.push(vert.normal.unwrap_or_default());
+    //         obj_data.texture.push(vert.tex_1.unwrap_or_default().0.try_into().unwrap());
+    //     }
+    //     for vert in mesh.data.verts_2.inner.map(|d| d.data).unwrap_or_default() {
+    //         obj_data.position.push(vert.xyz);
+    //         obj_data.normal.push(vert.normal.unwrap_or_default());
+    //     }
+    // }
+    todo!("EMI to OBJ converter");
+}
+
+fn main() -> Result<()> {
+    let args = Args::try_parse()?;
+    match args.command {
+        Commands::FindScrapland => {
+            let data = to_bytes(&find_scrap::get_executable()?, args.json)?;
+            let mut stdout = std::io::stdout().lock();
+            std::io::copy(&mut &data[..], &mut stdout)?;
+        }
+        Commands::ParsePacked { scrap_path } => {
+            let data = to_bytes(&cmd_parse_packed(&scrap_path)?, args.json)?;
+            let mut stdout = std::io::stdout().lock();
+            std::io::copy(&mut &data[..], &mut stdout)?;
+        }
+        Commands::ParseFile {
+            stdout,
+            root,
+            level,
+        } => {
+            cmd_parse_file(stdout, &root, &level, args.json)?;
+        }
+    }
+    Ok(())
+}
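The `PackedHeader`/`PackedFile` binrw structs above describe the `.packed` table of contents: a `BFPK` magic, a u32 version (must be 0), a u32 entry count, then per entry a length-prefixed path plus u32 size and offset. A Python sketch of the same layout (assuming `PascalString` is a little-endian u32 length followed by raw bytes):

```python
import io
import struct


def parse_packed_header(fh):
    """Read the BFPK table of contents from a .packed archive stream."""
    assert fh.read(4) == b"BFPK", "not a .packed archive"
    version, num_files = struct.unpack("<II", fh.read(8))
    assert version == 0, f"unexpected version {version}"
    files = []
    for _ in range(num_files):
        # Assumed PascalString layout: u32 length prefix, then the bytes.
        (path_len,) = struct.unpack("<I", fh.read(4))
        path = fh.read(path_len).decode("latin-1")
        size, offset = struct.unpack("<II", fh.read(8))
        files.append((path, size, offset))
    return files
```

With the table in hand, each file's payload is just `fh.seek(offset); fh.read(size)` against the same archive.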
@@ -1,3 +1,7 @@
+// https://learn.microsoft.com/en-us/windows/win32/direct3dhlsl/dx9-graphics-reference-asm-ps-1-x
+//
+// ################################################
+//
 // #[derive(Debug)]
 // enum VecArg {
 //     Tex(f32,f32,f32,f32),