[SSG-38 SSC-14 SSG-29]: Merge pull request #59 from Signal-K/SSG-38
[SSG-38 SSC-14 SSG-29]: Content for Chapter 2-3 integration with player pathway
Showing 72 changed files with 1,572 additions and 5 deletions.
@@ -0,0 +1,80 @@
import os
from supabase import create_client, Client
from pathlib import Path

# Initialize Supabase client
def init_supabase_client():
    url = "http://127.0.0.1:54321"
    key = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0"
    return create_client(url, key)

def upload_file_to_supabase(supabase: Client, bucket_name: str, file_path: str, destination_path: str):
    with open(file_path, "rb") as file:
        try:
            supabase.storage.from_(bucket_name).upload(destination_path, file)
            print(f"Uploaded {file_path} -> {destination_path}")
            return True
        except Exception as e:
            if "Duplicate" in str(e):
                print(f"File already exists: {file_path}. Proceeding with database insertion.")
                return True
            print(f"Failed to upload {file_path} -> {destination_path}: {e}")
            return False

def check_anomaly_exists(supabase: Client, anomaly_id):
    try:
        response = supabase.table('anomalies').select("*").eq("id", anomaly_id).execute()
        return len(response.data) > 0
    except Exception as e:
        print(f"Error checking for anomaly {anomaly_id}: {e}")
        return False

def insert_into_anomalies(supabase: Client, anomaly_id, content, anomaly_set: str):
    if not check_anomaly_exists(supabase, anomaly_id):
        try:
            data = {
                "id": anomaly_id,
                "content": content,
                "anomalytype": "zoodexOthers",
                "anomalySet": anomaly_set,  # use the caller-supplied set (was hard-coded, leaving the parameter unused)
            }
            supabase.table('anomalies').insert(data).execute()
            print(f"Inserted anomaly with id {anomaly_id} into 'anomalies' table.")
        except Exception as e:
            print(f"Failed to insert anomaly {anomaly_id}: {e}")
    else:
        print(f"Anomaly {anomaly_id} already exists in the database. Skipping insertion.")

def upload_directory_to_supabase(supabase: Client, bucket_name: str, local_directory: str):
    # Iterate over each subfolder in the local directory
    for anomaly_folder in os.listdir(local_directory):
        full_path = os.path.join(local_directory, anomaly_folder)
        if os.path.isdir(full_path):
            # Treat each subfolder as an anomaly
            anomaly_id = anomaly_folder  # Use the folder name as the anomaly ID
            anomaly_set = "zoodex-nestQuestGo"  # Match the bucket this script uploads to

            # Insert the anomaly into the database
            insert_into_anomalies(supabase, anomaly_id, anomaly_id, anomaly_set)

            # Upload all files from the subfolder
            for root, _, files in os.walk(full_path):
                for file_name in files:
                    if file_name.startswith('.'):
                        continue

                    file_path = os.path.join(root, file_name)
                    relative_path = os.path.relpath(file_path, local_directory)
                    destination_path = f"{anomaly_id}/{Path(relative_path).as_posix()}"

                    upload_file_to_supabase(supabase, bucket_name, file_path, destination_path)

def main():
    supabase = init_supabase_client()
    bucket_name = "zoodex/zoodex-nestQuestGo"
    local_directory = "zoodex/zoodex-nestQuestGo"

    upload_directory_to_supabase(supabase, bucket_name, local_directory)

if __name__ == "__main__":
    main()
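The destination-path logic above can be checked without a live Supabase instance. The sketch below is a hypothetical helper, not part of this commit: it mirrors the walk in upload_directory_to_supabase and just returns the (local path, bucket path) pairs that would be uploaded.

```python
import os
from pathlib import Path

def build_destination_paths(local_directory: str):
    """Return (local file path, bucket destination) pairs, mirroring
    the directory walk in upload_directory_to_supabase."""
    pairs = []
    for anomaly_folder in sorted(os.listdir(local_directory)):
        full_path = os.path.join(local_directory, anomaly_folder)
        if not os.path.isdir(full_path):
            continue  # each subfolder is one anomaly; stray files are skipped
        for root, _, files in os.walk(full_path):
            for file_name in sorted(files):
                if file_name.startswith('.'):
                    continue  # skip hidden files such as .DS_Store
                file_path = os.path.join(root, file_name)
                # relative_path is computed from the parent directory, so the
                # anomaly folder name appears twice in the destination; this
                # matches the behaviour of the script above.
                relative_path = os.path.relpath(file_path, local_directory)
                pairs.append((file_path, f"{anomaly_folder}/{Path(relative_path).as_posix()}"))
    return pairs
```

Running this against a staging copy of the zoodex directory makes it easy to eyeball the destination paths before any network calls happen.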
@@ -0,0 +1,73 @@
Anomaly Type: planet
  - ID: 50365310, Content: TIC 50365310
    Classifications:
      * ID: 12, Content: this is NOT a test
      * ID: 1, Content: Test
      * ID: 20, Content: Test
  - ID: 6, Content: TOI-700d
    Classifications:
      * ID: 12, Content: this is NOT a test
      * ID: 1, Content: Test
      * ID: 20, Content: Test
  - ID: 1, Content: Kepler-69c
    Classifications:
      * ID: 12, Content: this is NOT a test
      * ID: 1, Content: Test
      * ID: 20, Content: Test

Anomaly Type: zoodexOthers
  - ID: 47335863, Content: 47335863
    Classifications:
      * ID: 9, Content:

Anomaly Type: telescope-minorPlanet
  - ID: 100879215, Content: 100879215
    Classifications:
      * ID: 19, Content: In the center of the circle

Anomaly Type: automatonSatellitePhoto
  - ID: 79738567, Content: 79738567
    Classifications:
      * ID: 18, Content: ano345678tg7uybhkjqwfvertgqwyukehrjvrew7rtfgoyuhtgfvbwer8ioghtgbvaer8iotyhzjgzveb5rilhk
      * ID: 17, Content: another testerhguijboknm yaghedrthjnrtgrjnstzrednhjy
      * ID: 16, Content: Test
  - ID: 57538410, Content: 57538410
    Classifications:
      * ID: 18, Content: ano345678tg7uybhkjqwfvertgqwyukehrjvrew7rtfgoyuhtgfvbwer8ioghtgbvaer8iotyhzjgzveb5rilhk
      * ID: 17, Content: another testerhguijboknm yaghedrthjnrtgrjnstzrednhjy
      * ID: 16, Content: Test
  - ID: 57511112, Content: 57511112
    Classifications:
      * ID: 18, Content: ano345678tg7uybhkjqwfvertgqwyukehrjvrew7rtfgoyuhtgfvbwer8ioghtgbvaer8iotyhzjgzveb5rilhk
      * ID: 17, Content: another testerhguijboknm yaghedrthjnrtgrjnstzrednhjy
      * ID: 16, Content: Test

Anomalies of Type: planet
  - ID: 69, Content: Earth
  - ID: 50365310, Content: TIC 50365310
  - ID: 65212867, Content: TIC 65212867
  - ID: 88863718, Content: TIC 88863718
  - ID: 106997505, Content: TIC 106997505
  - ID: 124709665, Content: TIC 124709665
  - ID: 156115721, Content: TIC 156115721
  - ID: 169904935, Content: TIC 169904935
  - ID: 238597883, Content: TIC 238597883
  - ID: 440801822, Content: TIC 440801822
  - ID: 3, Content: Kepler-442b
  - ID: 4, Content: Kepler-22b
  - ID: 5, Content: Trappist-1f
  - ID: 6, Content: TOI-700d
  - ID: 1, Content: Kepler-69c
  - ID: 277039287, Content: 277039287
  - ID: 57299130, Content: 57299130
  - ID: 21720215, Content: 21720215
  - ID: 263723967, Content: 263723967
  - ID: 284300833, Content: 284300833
  - ID: 269343479, Content: 269343479
  - ID: 345724317, Content: 345724317
  - ID: 210904767, Content: 210904767
  - ID: 329981856, Content: 329981856
  - ID: 201175570, Content: 201175570
  - ID: 2, Content: Kepler-186f
  - ID: 35, Content: Earth Globe
@@ -0,0 +1,80 @@
from supabase import create_client, Client

def init_supabase_client():
    url = "http://127.0.0.1:54321"
    key = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJzdXBhYmFzZS1kZW1vIiwicm9sZSI6ImFub24iLCJleHAiOjE5ODM4MTI5OTZ9.CRXP1A7WOeoJeXxjNni43kdQwgnWNReilDMblYTn_I0"
    return create_client(url, key)

def fetch_anomalies(supabase: Client):
    response = supabase.table('anomalies').select("*").execute()
    return response.data

def fetch_classifications(supabase: Client):
    response = supabase.table('classifications').select("*").execute()
    return response.data

def group_anomalies_by_type(anomalies, classifications):
    # Extract anomaly IDs from classifications
    classified_anomaly_ids = {classification['anomaly'] for classification in classifications if classification['anomaly']}

    # Group anomalies by their type and include classifications
    grouped_anomalies = {}
    for anomaly in anomalies:
        anomaly_id = anomaly['id']
        if anomaly_id in classified_anomaly_ids:
            anomaly_type = anomaly['anomalytype']
            if anomaly_type not in grouped_anomalies:
                grouped_anomalies[anomaly_type] = {'anomalies': [], 'classifications': []}
            grouped_anomalies[anomaly_type]['anomalies'].append(anomaly)
            # Add all classifications for this anomaly
            for classification in classifications:
                if classification['anomaly'] == anomaly_id:
                    grouped_anomalies[anomaly_type]['classifications'].append(classification)

    return grouped_anomalies

def extract_planet_anomalies(anomalies):
    return [anomaly for anomaly in anomalies if anomaly['anomalytype'] == "planet"]

def export_to_txt(grouped_anomalies, planet_anomalies, filename='anomalies_grouped.txt'):
    with open(filename, 'w') as file:
        # Write grouped anomalies
        for anomaly_type, data in grouped_anomalies.items():
            file.write(f"Anomaly Type: {anomaly_type}\n")
            for anomaly in data['anomalies']:
                file.write(f"  - ID: {anomaly['id']}, Content: {anomaly['content']}\n")
            # Write associated classifications
            classifications = data['classifications']
            if classifications:
                file.write("    Classifications:\n")
                for classification in classifications:
                    file.write(f"      * ID: {classification['id']}, Content: {classification['content']}\n")
            file.write("\n")

        # Write planet anomalies
        file.write("Anomalies of Type: planet\n")
        for anomaly in planet_anomalies:
            file.write(f"  - ID: {anomaly['id']}, Content: {anomaly['content']}\n")
        file.write("\n")

    print(f"Data exported to {filename}")

def main():
    supabase = init_supabase_client()

    # Fetch data from Supabase
    anomalies = fetch_anomalies(supabase)
    classifications = fetch_classifications(supabase)

    # Group anomalies
    grouped_anomalies = group_anomalies_by_type(anomalies, classifications)

    # Extract planet anomalies
    planet_anomalies = extract_planet_anomalies(anomalies)

    # Export results to a text file
    export_to_txt(grouped_anomalies, planet_anomalies)

if __name__ == "__main__":
    main()
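The grouping step lends itself to a quick in-memory check. In this sketch the sample rows are invented for illustration, and the condensed defaultdict version is an assumed equivalent of the committed function, not the committed code itself; it shows that only anomalies with at least one classification are kept.

```python
from collections import defaultdict

# Sample rows shaped like the Supabase tables (invented for illustration).
anomalies = [
    {"id": 1, "anomalytype": "planet", "content": "Kepler-69c"},
    {"id": 2, "anomalytype": "planet", "content": "Kepler-186f"},
    {"id": 9, "anomalytype": "zoodexOthers", "content": "47335863"},
]
classifications = [
    {"id": 12, "anomaly": 1, "content": "this is NOT a test"},
    {"id": 19, "anomaly": 9, "content": "In the center of the circle"},
]

def group_anomalies_by_type(anomalies, classifications):
    """Condensed equivalent of the grouping step: keep only anomalies
    that have at least one classification, bucketed by anomalytype."""
    classified = {c["anomaly"] for c in classifications if c["anomaly"]}
    grouped = defaultdict(lambda: {"anomalies": [], "classifications": []})
    for a in anomalies:
        if a["id"] not in classified:
            continue  # unclassified anomalies are dropped from the grouping
        grouped[a["anomalytype"]]["anomalies"].append(a)
        grouped[a["anomalytype"]]["classifications"].extend(
            c for c in classifications if c["anomaly"] == a["id"]
        )
    return dict(grouped)

grouped = group_anomalies_by_type(anomalies, classifications)
```

Kepler-186f has no classification row, so it appears in the planet list of the export but not in the grouped section, which is exactly the split visible in anomalies_grouped.txt above.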
@@ -0,0 +1,25 @@
#!/bin/bash

# Run the first Python script to generate anomalies and their classifications
echo "Running classifications.py..."
python3 classifications.py

# Check if the first script was successful
if [ $? -ne 0 ]; then
    echo "classifications.py failed."
    exit 1
fi

# Run the second Python script to search for light curves
echo "Running lightcurveCreate.py..."
python3 lightcurveCreate.py

# Check if the second script was successful
if [ $? -ne 0 ]; then
    echo "lightcurveCreate.py failed."
    exit 1
fi

# Group the TESS sectors found in the light-curve results
./extractSectors.sh

echo "All scripts ran successfully."
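The same stop-on-first-failure pipeline can be sketched in Python with subprocess, which makes the per-step status easier to log or extend; the script names in the commented example are the ones invoked by the shell script above.

```python
import subprocess
import sys

def run_pipeline(commands):
    """Run each command in order, stopping at the first non-zero exit
    (mirrors the $? checks in the shell script)."""
    for cmd in commands:
        result = subprocess.run(cmd)
        if result.returncode != 0:
            print(f"{' '.join(cmd)} failed.")
            return False
    print("All scripts ran successfully.")
    return True

# Example: the same three steps as the shell script.
# run_pipeline([
#     [sys.executable, "classifications.py"],
#     [sys.executable, "lightcurveCreate.py"],
#     ["./extractSectors.sh"],
# ])
```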
@@ -0,0 +1,47 @@
#!/bin/bash

OUTPUT_FILE="lightcurve_results.txt"
GROUPED_OUTPUT_FILE="grouped_sectors.txt"

# Check if the output file exists
if [ ! -f "$OUTPUT_FILE" ]; then
    echo "Output file $OUTPUT_FILE not found."
    exit 1
fi

# Start with an empty grouped file so re-runs do not append duplicates
> "$GROUPED_OUTPUT_FILE"

# Extract unique sectors and format the output
echo "Extracting unique sectors and formatting output..."
{
    # Initialize variables
    current_tic=""
    first_sector=""

    # Read the output file line by line
    while IFS= read -r line; do
        # Check for a TIC ID line
        if [[ $line == TIC* ]]; then
            # Print the sectors for the previous TIC, but only if a sector was found
            if [ -n "$current_tic" ] && [ -n "$first_sector" ]; then
                echo "TIC $current_tic: $first_sector" >> "$GROUPED_OUTPUT_FILE"
            fi

            # Reset variables and update current TIC
            current_tic=$(echo "$line" | cut -d ' ' -f 2)
            first_sector=""  # Reset the first sector for the new TIC
        fi

        # Check for sector lines and extract the first sector
        if [[ $line == *"TESS Sector"* ]]; then
            sector=$(echo "$line" | awk '{print $2, $3, $4}')  # Extract relevant columns
            # If first_sector is not set, set it to the current sector
            if [ -z "$first_sector" ]; then
                first_sector="$sector"
            fi
        fi
    done < "$OUTPUT_FILE"

    # Print the last TIC if it exists and has a sector
    if [ -n "$current_tic" ] && [ -n "$first_sector" ]; then
        echo "TIC $current_tic: $first_sector" >> "$GROUPED_OUTPUT_FILE"
    fi
}

echo "Unique sectors extracted and formatted in $GROUPED_OUTPUT_FILE."
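The while/awk loop can be mirrored in Python for easier testing. This is a hypothetical parser, not part of the commit, and it assumes the lightcurve_results.txt layout puts an index column before the "TESS Sector" fields (matching the awk '{print $2, $3, $4}' extraction above); it keeps the first sector seen for each TIC.

```python
def first_sector_per_tic(lines):
    """Map each TIC ID to the first 'TESS Sector NN' entry that follows
    its header line, mirroring the shell loop above."""
    results = {}
    current_tic = None
    for line in lines:
        line = line.strip()
        if line.startswith("TIC"):
            # Header line: second whitespace-separated field is the TIC ID
            current_tic = line.split()[1]
        elif "TESS Sector" in line and current_tic and current_tic not in results:
            # Assumed row shape: '<index> TESS Sector <NN> ...'
            fields = line.split()
            results[current_tic] = " ".join(fields[1:4])
    return results
```

The output matches the grouped_sectors.txt format once each entry is printed as "TIC {tic}: {sector}".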
@@ -0,0 +1,20 @@
TIC 50365310: TESS Sector 34
TIC 65212867: TESS Sector 34
TIC 88863718: TESS Sector 34
TIC 106997505: TESS Sector 34
TIC 124709665: TESS Sector 33
TIC 156115721: TESS Sector 34
TIC 169904935: TESS Sector 34
TIC 238597883: TESS Sector 34
TIC 440801822: TESS Sector 33
TIC 5: TESS Sector 38
TIC 277039287: TESS Sector 67
TIC 57299130: TESS Sector 24
TIC 21720215: TESS Sector 79
TIC 263723967: TESS Sector 25
TIC 284300833: TESS Sector 78
TIC 269343479: TESS Sector 61
TIC 345724317: TESS Sector 25
TIC 210904767: TESS Sector 12
TIC 329981856: TESS Sector 11
TIC 201175570: TESS Sector 69