Merge remote-tracking branch 'upstream/develop' into streamImagesForAutoIngest

apriestman 2020-07-29 14:48:01 -04:00
commit d833e27d01
117 changed files with 3071 additions and 1457 deletions

View File

@@ -37,16 +37,16 @@ to the root 64-bit JRE directory.
 2) Get Sleuth Kit Setup
 2a) Download and build a Release version of Sleuth Kit (TSK) 4.0. See
 win32\BUILDING.txt in the TSK package for more information. You need to
-build the tsk_jni project. Select the Release_PostgreSQL Win32 or x64 target,
+build the tsk_jni project. Select the Release Win32 or x64 target,
 depending upon your target build. You can use a released version or download
 the latest from github:
 - git://github.com/sleuthkit/sleuthkit.git
-2b) Build the TSK JAR file by typing 'ant dist-PostgreSQL' in
+2b) Build the TSK JAR file by typing 'ant dist' in
 bindings/java in the
 TSK source code folder from a command line. Note it is case
 sensitive. You can also add the code to a NetBeans project and build
-it from there, selecting the dist-PostgreSQL target.
+it from there, selecting the dist target.
 2c) Set TSK_HOME environment variable to the root directory of TSK
@@ -103,7 +103,7 @@ the build process.
 - The Sleuth Kit Java datamodel JAR file has native JNI libraries
 that are copied into it. These JNI libraries have dependencies on
-libewf, zlib, libpq, libintl-8, libeay32, and ssleay32 DLL files. On non-Windows
+libewf, zlib, libintl-8, libeay32, and ssleay32 DLL files. On non-Windows
 platforms, the JNI library also has a dependency on libtsk (on Windows,
 it is compiled into libtsk_jni).

View File

@@ -59,6 +59,11 @@
     <fileset dir="${thirdparty.dir}/InterestingFileSetRules"/>
 </copy>
+
+<!--Copy OfficialHashSets to release-->
+<copy todir="${basedir}/release/OfficialHashSets" >
+    <fileset dir="${thirdparty.dir}/OfficialHashSets"/>
+</copy>
 <!-- The 'libgstlibav.dll' file is too big to store on GitHub, so we
 have it stored in a ZIP file. We'll extract it in place and remove
 the ZIP file afterward. -->

View File

@@ -83,7 +83,7 @@ file.reference.sevenzipjbinding.jar=release/modules/ext/sevenzipjbinding.jar
 file.reference.sis-metadata-0.8.jar=release\\modules\\ext\\sis-metadata-0.8.jar
 file.reference.sis-netcdf-0.8.jar=release\\modules\\ext\\sis-netcdf-0.8.jar
 file.reference.sis-utility-0.8.jar=release\\modules\\ext\\sis-utility-0.8.jar
-file.reference.sleuthkit-caseuco-4.9.0.jar=release\\modules\\ext\\sleuthkit-caseuco-4.9.0.jar
+file.reference.sleuthkit-caseuco-4.10.0.jar=release/modules/ext/sleuthkit-caseuco-4.10.0.jar
 file.reference.slf4j-api-1.7.25.jar=release\\modules\\ext\\slf4j-api-1.7.25.jar
 file.reference.sqlite-jdbc-3.25.2.jar=release/modules/ext/sqlite-jdbc-3.25.2.jar
 file.reference.StixLib.jar=release/modules/ext/StixLib.jar
@@ -91,7 +91,7 @@ file.reference.javax.ws.rs-api-2.0.1.jar=release/modules/ext/javax.ws.rs-api-2.0
 file.reference.cxf-core-3.0.16.jar=release/modules/ext/cxf-core-3.0.16.jar
 file.reference.cxf-rt-frontend-jaxrs-3.0.16.jar=release/modules/ext/cxf-rt-frontend-jaxrs-3.0.16.jar
 file.reference.cxf-rt-transports-http-3.0.16.jar=release/modules/ext/cxf-rt-transports-http-3.0.16.jar
-file.reference.sleuthkit-4.9.0.jar=release/modules/ext/sleuthkit-4.9.0.jar
+file.reference.sleuthkit-4.10.0.jar=release/modules/ext/sleuthkit-4.10.0.jar
 file.reference.curator-client-2.8.0.jar=release/modules/ext/curator-client-2.8.0.jar
 file.reference.curator-framework-2.8.0.jar=release/modules/ext/curator-framework-2.8.0.jar
 file.reference.curator-recipes-2.8.0.jar=release/modules/ext/curator-recipes-2.8.0.jar

View File

@@ -472,8 +472,8 @@
     <binary-origin>release/modules/ext/commons-pool2-2.4.2.jar</binary-origin>
 </class-path-extension>
 <class-path-extension>
-    <runtime-relative-path>ext/sleuthkit-4.9.0.jar</runtime-relative-path>
-    <binary-origin>release/modules/ext/sleuthkit-4.9.0.jar</binary-origin>
+    <runtime-relative-path>ext/sleuthkit-4.10.0.jar</runtime-relative-path>
+    <binary-origin>release/modules/ext/sleuthkit-4.10.0.jar</binary-origin>
 </class-path-extension>
 <class-path-extension>
     <runtime-relative-path>ext/jxmapviewer2-2.4.jar</runtime-relative-path>
@@ -780,8 +780,8 @@
     <binary-origin>release/modules/ext/curator-client-2.8.0.jar</binary-origin>
 </class-path-extension>
 <class-path-extension>
-    <runtime-relative-path>ext/sleuthkit-caseuco-4.9.0.jar</runtime-relative-path>
-    <binary-origin>release\modules\ext\sleuthkit-caseuco-4.9.0.jar</binary-origin>
+    <runtime-relative-path>ext/sleuthkit-caseuco-4.10.0.jar</runtime-relative-path>
+    <binary-origin>release/modules/ext/sleuthkit-caseuco-4.10.0.jar</binary-origin>
 </class-path-extension>
 <class-path-extension>
     <runtime-relative-path>ext/fontbox-2.0.13.jar</runtime-relative-path>

View File

@@ -29,6 +29,9 @@ import org.openide.util.Utilities;
 import org.openide.windows.WindowManager;
 import org.sleuthkit.autopsy.casemodule.Case;
 import org.sleuthkit.autopsy.casemodule.NoCurrentCaseException;
+import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager;
+import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager.ContentViewerTag;
+import org.sleuthkit.autopsy.contentviewers.imagetagging.ImageTagRegion;
 import org.sleuthkit.autopsy.coreutils.Logger;
 import org.sleuthkit.datamodel.ContentTag;
 import org.sleuthkit.datamodel.TskCoreException;
@@ -72,6 +75,12 @@ public class DeleteContentTagAction extends AbstractAction {
         new Thread(() -> {
             for (ContentTag tag : selectedTags) {
                 try {
+                    // Check if there is an image tag before deleting the content tag.
+                    ContentViewerTag<ImageTagRegion> imageTag = ContentViewerTagManager.getTag(tag, ImageTagRegion.class);
+                    if(imageTag != null) {
+                        ContentViewerTagManager.deleteTag(imageTag);
+                    }
                     Case.getCurrentCaseThrows().getServices().getTagsManager().deleteContentTag(tag);
                 } catch (TskCoreException | NoCurrentCaseException ex) {
                     Logger.getLogger(DeleteContentTagAction.class.getName()).log(Level.SEVERE, "Error deleting tag", ex); //NON-NLS

View File

@@ -39,6 +39,9 @@ import org.openide.util.actions.Presenter;
 import org.sleuthkit.autopsy.casemodule.Case;
 import org.sleuthkit.autopsy.casemodule.NoCurrentCaseException;
 import org.sleuthkit.autopsy.casemodule.services.TagsManager;
+import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager;
+import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager.ContentViewerTag;
+import org.sleuthkit.autopsy.contentviewers.imagetagging.ImageTagRegion;
 import org.sleuthkit.autopsy.coreutils.Logger;
 import org.sleuthkit.autopsy.tags.TagUtils;
 import org.sleuthkit.datamodel.AbstractFile;
@@ -123,6 +126,13 @@ public class DeleteFileContentTagAction extends AbstractAction implements Presen
         try {
             logger.log(Level.INFO, "Removing tag {0} from {1}", new Object[]{tagName.getDisplayName(), contentTag.getContent().getName()}); //NON-NLS
+
+            // Check if there is an image tag before deleting the content tag.
+            ContentViewerTag<ImageTagRegion> imageTag = ContentViewerTagManager.getTag(contentTag, ImageTagRegion.class);
+            if(imageTag != null) {
+                ContentViewerTagManager.deleteTag(imageTag);
+            }
+
             tagsManager.deleteContentTag(contentTag);
         } catch (TskCoreException tskCoreException) {
             logger.log(Level.SEVERE, "Error untagging file", tskCoreException); //NON-NLS

View File

@@ -29,6 +29,9 @@ import org.openide.util.Utilities;
 import org.sleuthkit.autopsy.casemodule.Case;
 import org.sleuthkit.autopsy.casemodule.NoCurrentCaseException;
 import org.sleuthkit.autopsy.casemodule.services.TagsManager;
+import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager;
+import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager.ContentViewerTag;
+import org.sleuthkit.autopsy.contentviewers.imagetagging.ImageTagRegion;
 import org.sleuthkit.autopsy.coreutils.Logger;
 import org.sleuthkit.datamodel.ContentTag;
 import org.sleuthkit.datamodel.TagName;
@@ -83,9 +86,19 @@ public final class ReplaceContentTagAction extends ReplaceTagAction<ContentTag>
         try {
             logger.log(Level.INFO, "Replacing tag {0} with tag {1} for artifact {2}", new Object[]{oldTag.getName().getDisplayName(), newTagName.getDisplayName(), oldTag.getContent().getName()}); //NON-NLS
-            tagsManager.deleteContentTag(oldTag);
-            tagsManager.addContentTag(oldTag.getContent(), newTagName, newComment);
+            // Check if there is an image tag before deleting the content tag.
+            ContentViewerTag<ImageTagRegion> imageTag = ContentViewerTagManager.getTag(oldTag, ImageTagRegion.class);
+            if(imageTag != null) {
+                ContentViewerTagManager.deleteTag(imageTag);
+            }
+
+            tagsManager.deleteContentTag(oldTag);
+            ContentTag newTag = tagsManager.addContentTag(oldTag.getContent(), newTagName, newComment);
+
+            // Resave the image tag if present.
+            if(imageTag != null) {
+                ContentViewerTagManager.saveTag(newTag, imageTag.getDetails());
+            }
         } catch (TskCoreException tskCoreException) {
             logger.log(Level.SEVERE, "Error replacing artifact tag", tskCoreException); //NON-NLS
             Platform.runLater(()
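Taken together, the three tag-action changes above apply one pattern: any ContentViewerTag (an image-region tag layered on a ContentTag) is deleted before the underlying content tag, and re-saved against the replacement tag when one is created. The following is a minimal sketch of that sequence using only the methods shown in the diffs; the wrapper class, method name, and the blanket throws clause are hypothetical, not part of this commit.

import org.sleuthkit.autopsy.casemodule.services.TagsManager;
import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager;
import org.sleuthkit.autopsy.casemodule.services.contentviewertags.ContentViewerTagManager.ContentViewerTag;
import org.sleuthkit.autopsy.contentviewers.imagetagging.ImageTagRegion;
import org.sleuthkit.datamodel.ContentTag;
import org.sleuthkit.datamodel.TagName;

class TagReplaceSketch {
    // Hypothetical helper mirroring ReplaceContentTagAction: preserve an image tag
    // across a content-tag replacement. Real callers catch the specific checked
    // exceptions instead of the generic Exception used here for brevity.
    static void replacePreservingImageTag(TagsManager tagsManager, ContentTag oldTag,
            TagName newTagName, String newComment) throws Exception {
        // Check if there is an image tag before deleting the content tag.
        ContentViewerTag<ImageTagRegion> imageTag =
                ContentViewerTagManager.getTag(oldTag, ImageTagRegion.class);
        if (imageTag != null) {
            ContentViewerTagManager.deleteTag(imageTag);
        }

        tagsManager.deleteContentTag(oldTag);
        ContentTag newTag = tagsManager.addContentTag(oldTag.getContent(), newTagName, newComment);

        // Resave the image tag region against the replacement tag, if one was present.
        if (imageTag != null) {
            ContentViewerTagManager.saveTag(newTag, imageTag.getDetails());
        }
    }
}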

View File

@@ -305,8 +305,40 @@ public final class CentralRepoAccount {
                 normalizedAccountIdentifier = accountIdentifier.toLowerCase().trim();
             }
         } catch (CorrelationAttributeNormalizationException ex) {
-            throw new InvalidAccountIDException("Failed to normalize the account idenitier.", ex);
+            throw new InvalidAccountIDException("Failed to normalize the account idenitier " + accountIdentifier, ex);
         }
         return normalizedAccountIdentifier;
     }
+
+    /**
+     * Normalizes an account identifier, based on the given account type.
+     *
+     * @param crAccountType     Account type.
+     * @param accountIdentifier Account identifier to be normalized.
+     * @return Normalized identifier.
+     *
+     * @throws InvalidAccountIDException If the account identifier is invalid.
+     */
+    public static String normalizeAccountIdentifier(CentralRepoAccountType crAccountType, String accountIdentifier) throws InvalidAccountIDException {
+        if (StringUtils.isBlank(accountIdentifier)) {
+            throw new InvalidAccountIDException("Account identifier is null or empty.");
+        }
+
+        String normalizedAccountIdentifier;
+        try {
+            if (crAccountType.getAcctType().equals(Account.Type.PHONE)) {
+                normalizedAccountIdentifier = CorrelationAttributeNormalizer.normalizePhone(accountIdentifier);
+            } else if (crAccountType.getAcctType().equals(Account.Type.EMAIL)) {
+                normalizedAccountIdentifier = CorrelationAttributeNormalizer.normalizeEmail(accountIdentifier);
+            } else {
+                // convert to lowercase
+                normalizedAccountIdentifier = accountIdentifier.toLowerCase();
+            }
+        } catch (CorrelationAttributeNormalizationException ex) {
+            throw new InvalidAccountIDException("Invalid account identifier", ex);
+        }
+        return normalizedAccountIdentifier;
+    }
 }
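As context for the new static helper above, here is a minimal usage sketch; the wrapper class and method are hypothetical and only illustrate the normalization paths (phone/email via CorrelationAttributeNormalizer, everything else lower-cased) and the failure mode introduced by this commit.

import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount.CentralRepoAccountType;
import org.sleuthkit.datamodel.InvalidAccountIDException;

class NormalizeIdSketch {
    // Hypothetical caller: canonicalize a raw identifier before storing or querying it.
    static String toCanonicalId(CentralRepoAccountType type, String rawId) {
        try {
            // Phone and email identifiers go through CorrelationAttributeNormalizer;
            // all other account types are simply lower-cased.
            return CentralRepoAccount.normalizeAccountIdentifier(type, rawId);
        } catch (InvalidAccountIDException ex) {
            // Blank or malformed identifiers are rejected rather than stored as-is.
            return null;
        }
    }
}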

View File

@@ -262,9 +262,7 @@ public class CentralRepoDbUtil {
      * used
      */
     public static void setUseCentralRepo(boolean centralRepoCheckBoxIsSelected) {
-        if (!centralRepoCheckBoxIsSelected) {
-            closePersonasTopComponent();
-        }
+        closePersonasTopComponent();
         ModuleSettings.setConfigSetting(CENTRAL_REPO_NAME, CENTRAL_REPO_USE_KEY, Boolean.toString(centralRepoCheckBoxIsSelected));
     }

View File

@@ -27,6 +27,7 @@ import org.sleuthkit.autopsy.casemodule.Case;
 import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount.CentralRepoAccountType;
 import org.sleuthkit.autopsy.coordinationservice.CoordinationService;
 import org.sleuthkit.datamodel.HashHitInfo;
+import org.sleuthkit.datamodel.InvalidAccountIDException;

 /**
  * Main interface for interacting with the database
@@ -881,8 +882,23 @@
      * @param accountUniqueID type specific unique account id
      * @return CR account
      *
-     * @throws CentralRepoException
+     * @throws CentralRepoException If there is an error accessing Central Repository.
+     * @throws InvalidAccountIDException If the account identifier is not valid.
      */
-    CentralRepoAccount getOrCreateAccount(CentralRepoAccount.CentralRepoAccountType crAccountType, String accountUniqueID) throws CentralRepoException;
+    CentralRepoAccount getOrCreateAccount(CentralRepoAccount.CentralRepoAccountType crAccountType, String accountUniqueID) throws InvalidAccountIDException, CentralRepoException;
+
+    /**
+     * Gets an account from the accounts table matching the given type/ID, if
+     * one exists.
+     *
+     * @param crAccountType CR account type to look for or create
+     * @param accountUniqueID type specific unique account id
+     *
+     * @return CR account, if found, null otherwise.
+     *
+     * @throws CentralRepoException If there is an error accessing Central Repository.
+     * @throws InvalidAccountIDException If the account identifier is not valid.
+     */
+    CentralRepoAccount getAccount(CentralRepoAccount.CentralRepoAccountType crAccountType, String accountUniqueID) throws InvalidAccountIDException, CentralRepoException;
 }
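A short sketch of how a caller of the revised interface might separate the two failure modes now declared on these methods; the class, method, and the decision to swallow the exceptions are hypothetical and only illustrate the distinction.

import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount.CentralRepoAccountType;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoException;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository;
import org.sleuthkit.datamodel.InvalidAccountIDException;

class AccountLookupSketch {
    // Hypothetical caller: distinguish a bad identifier from a repository failure.
    static CentralRepoAccount lookUpOrCreate(CentralRepository cr,
            CentralRepoAccountType type, String id) {
        try {
            return cr.getOrCreateAccount(type, id);
        } catch (InvalidAccountIDException ex) {
            // The identifier itself is invalid (e.g. an unparsable phone number); nothing was created.
            return null;
        } catch (CentralRepoException ex) {
            // The Central Repository could not be reached or updated.
            return null;
        }
    }
}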

View File

@@ -35,6 +35,7 @@ import org.sleuthkit.datamodel.BlackboardArtifact.ARTIFACT_TYPE;
 import org.sleuthkit.datamodel.BlackboardAttribute;
 import org.sleuthkit.datamodel.BlackboardAttribute.ATTRIBUTE_TYPE;
 import org.sleuthkit.datamodel.HashUtility;
+import org.sleuthkit.datamodel.InvalidAccountIDException;
 import org.sleuthkit.datamodel.TskCoreException;
 import org.sleuthkit.datamodel.TskData;
@@ -184,7 +185,11 @@
             }
         }
     } catch (CorrelationAttributeNormalizationException ex) {
-        logger.log(Level.SEVERE, String.format("Error normalizing correlation attribute (%s)", artifact), ex); // NON-NLS
+        logger.log(Level.WARNING, String.format("Error normalizing correlation attribute (%s)", artifact), ex); // NON-NLS
+        return correlationAttrs;
+    }
+    catch (InvalidAccountIDException ex) {
+        logger.log(Level.WARNING, String.format("Invalid account identifier (artifactID: %d)", artifact.getId())); // NON-NLS
         return correlationAttrs;
     }
     catch (CentralRepoException ex) {
@@ -281,7 +286,7 @@
      *
      * @return The correlation attribute instance.
      */
-    private static void makeCorrAttrFromAcctArtifact(List<CorrelationAttributeInstance> corrAttrInstances, BlackboardArtifact acctArtifact) throws TskCoreException, CentralRepoException {
+    private static void makeCorrAttrFromAcctArtifact(List<CorrelationAttributeInstance> corrAttrInstances, BlackboardArtifact acctArtifact) throws InvalidAccountIDException, TskCoreException, CentralRepoException {
         // Get the account type from the artifact
         BlackboardAttribute accountTypeAttribute = acctArtifact.getAttribute(new BlackboardAttribute.Type(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_ACCOUNT_TYPE));

View File

@@ -52,6 +52,7 @@ import org.sleuthkit.autopsy.healthmonitor.TimingMetric;
 import org.sleuthkit.datamodel.Account;
 import org.sleuthkit.datamodel.CaseDbSchemaVersionNumber;
 import org.sleuthkit.datamodel.HashHitInfo;
+import org.sleuthkit.datamodel.InvalidAccountIDException;
 import org.sleuthkit.datamodel.SleuthkitCase;
 import org.sleuthkit.datamodel.TskData;
@@ -1080,21 +1081,34 @@ abstract class RdbmsCentralRepo implements CentralRepository {
      * within TSK core
      */
     @Override
-    public CentralRepoAccount getOrCreateAccount(CentralRepoAccountType crAccountType, String accountUniqueID) throws CentralRepoException {
-        // Get the account fom the accounts table
-        String insertSQL = "INSERT INTO accounts (account_type_id, account_unique_identifier) "
-                + "VALUES (?, ?) " + getConflictClause();
+    public CentralRepoAccount getOrCreateAccount(CentralRepoAccountType crAccountType, String accountUniqueID) throws InvalidAccountIDException, CentralRepoException {
+        String normalizedAccountID = CentralRepoAccount.normalizeAccountIdentifier(crAccountType, accountUniqueID);
+
+        // insert the account. If there is a conflict, ignore it.
+        String insertSQL;
+        switch (CentralRepoDbManager.getSavedDbChoice().getDbPlatform()) {
+            case POSTGRESQL:
+                insertSQL = "INSERT INTO accounts (account_type_id, account_unique_identifier) VALUES (?, ?) " + getConflictClause(); //NON-NLS
+                break;
+            case SQLITE:
+                insertSQL = "INSERT OR IGNORE INTO accounts (account_type_id, account_unique_identifier) VALUES (?, ?) "; //NON-NLS
+                break;
+            default:
+                throw new CentralRepoException(String.format("Cannot add account to currently selected CR database platform %s", CentralRepoDbManager.getSavedDbChoice().getDbPlatform())); //NON-NLS
+        }

         try (Connection connection = connect();
                 PreparedStatement preparedStatement = connection.prepareStatement(insertSQL);) {

             preparedStatement.setInt(1, crAccountType.getAccountTypeId());
-            preparedStatement.setString(2, accountUniqueID); // TBD: fill in the normalized ID
+            preparedStatement.setString(2, normalizedAccountID);

             preparedStatement.executeUpdate();

             // get the account from the db - should exist now.
-            return getAccount(crAccountType, accountUniqueID);
+            return getAccount(crAccountType, normalizedAccountID);
         } catch (SQLException ex) {
             throw new CentralRepoException("Error adding an account to CR database.", ex);
         }
@@ -1177,15 +1191,17 @@
      * @return CentralRepoAccount for the give type/id. May return null if not
      *         found.
      *
-     * @throws CentralRepoException
+     * @throws CentralRepoException If there is an error accessing Central Repository.
+     * @throws InvalidAccountIDException If the account identifier is not valid.
      */
-    private CentralRepoAccount getAccount(CentralRepoAccountType crAccountType, String accountUniqueID) throws CentralRepoException {
-        CentralRepoAccount crAccount = accountsCache.getIfPresent(Pair.of(crAccountType, accountUniqueID));
+    @Override
+    public CentralRepoAccount getAccount(CentralRepoAccountType crAccountType, String accountUniqueID) throws InvalidAccountIDException, CentralRepoException {
+        String normalizedAccountID = CentralRepoAccount.normalizeAccountIdentifier(crAccountType, accountUniqueID);
+        CentralRepoAccount crAccount = accountsCache.getIfPresent(Pair.of(crAccountType, normalizedAccountID));
         if (crAccount == null) {
-            crAccount = getCRAccountFromDb(crAccountType, accountUniqueID);
+            crAccount = getCRAccountFromDb(crAccountType, normalizedAccountID);
             if (crAccount != null) {
-                accountsCache.put(Pair.of(crAccountType, accountUniqueID), crAccount);
+                accountsCache.put(Pair.of(crAccountType, normalizedAccountID), crAccount);
             }
         }

View File

@@ -43,6 +43,7 @@ import javax.swing.event.DocumentListener;
 import javax.swing.filechooser.FileFilter;
 import org.openide.util.NbBundle;
 import org.openide.util.NbBundle.Messages;
+import org.openide.windows.TopComponent;
 import org.openide.windows.WindowManager;
 import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoDbChoice;
 import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoDbManager;
@@ -660,6 +661,8 @@
      * found.
      */
     private static boolean testStatusAndCreate(Component parent, CentralRepoDbManager manager, EamDbSettingsDialog dialog) {
+        closePersonasTopComponent();
+
         parent.setCursor(Cursor.getPredefinedCursor(Cursor.WAIT_CURSOR));
         manager.testStatus();
@@ -691,6 +694,21 @@
         return true;
     }

+    /**
+     * Closes Personas top component if it exists.
+     */
+    private static void closePersonasTopComponent() {
+        SwingUtilities.invokeLater(() -> {
+            TopComponent personasWindow = WindowManager.getDefault().findTopComponent("PersonasTopComponent");
+            if (personasWindow != null && personasWindow.isOpened()) {
+                personasWindow.close();
+            }
+        });
+    }
+
     /**
      * This method returns if changes to the central repository configuration
      * were successfully applied.

View File

@@ -6,6 +6,8 @@ AddMetadataDialog_empty_name_Title=Missing field(s)
 CreatePersonaAccountDialog.title.text=Create Account
 CreatePersonaAccountDialog_error_msg=Failed to create account.
 CreatePersonaAccountDialog_error_title=Account failure
+CreatePersonaAccountDialog_invalid_account_msg=Account identifier is not valid.
+CreatePersonaAccountDialog_invalid_account_Title=Invalid account identifier
 CreatePersonaAccountDialog_success_msg=Account added.
 CreatePersonaAccountDialog_success_title=Account added
 CTL_OpenPersonas=Personas
@@ -132,4 +134,4 @@ PersonasTopComponent_delete_exception_Title=Delete failure
 PersonasTopComponent_Name=Personas
 PersonasTopComponent_noCR_msg=Central Repository is not enabled.
 PersonasTopComponent_search_exception_msg=Failed to search personas.
-PersonasTopComponent_search_exception_Title=Search failure
+PersonasTopComponent_search_exception_Title=There was a failure during the search. Try opening a case to fully initialize the central repository database.

View File

@@ -36,6 +36,7 @@ import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount.Cent
 import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoException;
 import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository;
 import org.sleuthkit.autopsy.coreutils.Logger;
+import org.sleuthkit.datamodel.InvalidAccountIDException;

 /**
  * Configuration dialog for creating an account.
@@ -216,7 +217,8 @@
     @Messages({
         "CreatePersonaAccountDialog_error_title=Account failure",
         "CreatePersonaAccountDialog_error_msg=Failed to create account.",
-    })
+        "CreatePersonaAccountDialog_invalid_account_Title=Invalid account identifier",
+        "CreatePersonaAccountDialog_invalid_account_msg=Account identifier is not valid.",})
     private CentralRepoAccount createAccount(CentralRepoAccount.CentralRepoAccountType type, String identifier) {
         CentralRepoAccount ret = null;
         try {
@@ -227,8 +229,14 @@
         } catch (CentralRepoException e) {
             logger.log(Level.SEVERE, "Failed to create account", e);
             JOptionPane.showMessageDialog(this,
-                    Bundle.CreatePersonaAccountDialog_error_title(),
                     Bundle.CreatePersonaAccountDialog_error_msg(),
+                    Bundle.CreatePersonaAccountDialog_error_title(),
+                    JOptionPane.ERROR_MESSAGE);
+        } catch (InvalidAccountIDException e) {
+            logger.log(Level.WARNING, "Invalid account identifier", e);
+            JOptionPane.showMessageDialog(this,
+                    Bundle.CreatePersonaAccountDialog_invalid_account_msg(),
+                    Bundle.CreatePersonaAccountDialog_invalid_account_Title(),
                     JOptionPane.ERROR_MESSAGE);
         }
         return ret;

View File

@@ -20,6 +20,8 @@ package org.sleuthkit.autopsy.centralrepository.persona;
 import java.awt.event.ActionEvent;
 import java.awt.event.ActionListener;
+import java.awt.event.ComponentAdapter;
+import java.awt.event.ComponentEvent;
 import java.util.ArrayList;
 import java.util.Collection;
 import java.util.List;
@@ -60,28 +62,6 @@ public final class PersonasTopComponent extends TopComponent {
     private List<Persona> currentResults = null;
     private Persona selectedPersona = null;

-    /**
-     * Listens for when this component will be rendered and executes a search to
-     * update gui when it is displayed.
-     */
-    private final AncestorListener onAddListener = new AncestorListener() {
-        @Override
-        public void ancestorAdded(AncestorEvent event) {
-            resetSearchControls();
-            setKeywordSearchEnabled(false, true);
-        }
-
-        @Override
-        public void ancestorRemoved(AncestorEvent event) {
-            //Empty
-        }
-
-        @Override
-        public void ancestorMoved(AncestorEvent event) {
-            //Empty
-        }
-    };
-
     @Messages({
         "PersonasTopComponent_Name=Personas",
         "PersonasTopComponent_delete_exception_Title=Delete failure",
@@ -165,7 +145,17 @@
             }
         });

-        addAncestorListener(onAddListener);
+        /**
+         * Listens for when this component will be rendered and executes a
+         * search to update gui when it is displayed.
+         */
+        addComponentListener(new ComponentAdapter() {
+            @Override
+            public void componentShown(ComponentEvent e) {
+                resetSearchControls();
+                setKeywordSearchEnabled(false, true);
+            }
+        });
     }

     /**
@@ -276,7 +266,7 @@
     }

     @Messages({
-        "PersonasTopComponent_search_exception_Title=Search failure",
+        "PersonasTopComponent_search_exception_Title=There was a failure during the search. Try opening a case to fully initialize the central repository database.",
         "PersonasTopComponent_search_exception_msg=Failed to search personas.",
         "PersonasTopComponent_noCR_msg=Central Repository is not enabled.",})
     private void executeSearch() {

View File

@@ -65,8 +65,10 @@ abstract class CVTFilterRefresher implements RefreshThrottler.Refresher {
         try (SleuthkitCase.CaseDbQuery dbQuery = skCase.executeQuery("SELECT MAX(date_time) as end, MIN(date_time) as start from account_relationships")) {
             // ResultSet is closed by CasDBQuery
             ResultSet rs = dbQuery.getResultSet();
+            rs.next();
             startTime = rs.getInt("start"); // NON-NLS
             endTime = rs.getInt("end"); // NON-NLS
         }
+
         // Get the devices with CVT artifacts
         List<Integer> deviceObjIds = new ArrayList<>();
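The one-line fix above works because a JDBC ResultSet starts positioned before its first row, even for an aggregate query that always returns exactly one row, so next() must be called before any column accessor. A small self-contained sketch of the same pattern; the class and method names here are illustrative, not from the Autopsy code.

import java.sql.ResultSet;
import java.sql.SQLException;

class TimeRangeSketch {
    // Advance the cursor before reading; calling getInt() on an unpositioned ResultSet fails.
    static int[] readTimeRange(ResultSet rs) throws SQLException {
        if (!rs.next()) {
            return new int[]{0, 0}; // defensive: the query produced no row at all
        }
        return new int[]{rs.getInt("start"), rs.getInt("end")};
    }
}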

View File

@@ -269,6 +269,7 @@ final public class FiltersPanel extends JPanel {
      * Populate the Account Types filter widgets.
      *
      * @param accountTypesInUse List of accountTypes currently in use
+     * @param checkNewOnes
      *
      * @return True, if a new accountType was found
      */
@@ -314,9 +315,8 @@
     /**
      * Populate the devices filter widgets.
      *
-     * @param selected      Sets the initial state of device check box.
-     * @param sleuthkitCase The sleuthkit case for containing the data source
-     *                      information.
+     * @param dataSourceMap
+     * @param checkNewOnes
      *
      * @return true if a new device was found
      */

View File

@@ -9,7 +9,6 @@ SummaryViewer.callLogsLabel.text=Call Logs:
 ThreadRootMessagePanel.showAllCheckBox.text=Show All Messages
 ThreadPane.backButton.text=<---
 SummaryViewer.caseReferencesPanel.border.title=Other Occurrences
-SummaryViewer.fileReferencesPanel.border.title=File References in Current Case
 MessageViewer.threadsLabel.text=Select a Thread to View
 MessageViewer.threadNameLabel.text=<threadName>
 MessageViewer.showingMessagesLabel.text=Showing Messages for Thread:
@@ -27,3 +26,5 @@ SummaryViewer.referencesLabel.text=Communication References:
 SummaryViewer.referencesDataLabel.text=<reference count>
 SummaryViewer.contactsLabel.text=Book Entries:
 SummaryViewer.accountCountry.text=<account country>
+SummaryViewer.fileRefPane.border.title=File References in Current Case
+SummaryViewer.selectAccountFileRefLabel.text=<Select a single account to see File References>

View File

@@ -49,13 +49,13 @@ SummaryViewer_CentralRepository_Message=<Enable Central Respository to see Other
 SummaryViewer_Country_Code=Country:
 SummaryViewer_Creation_Date_Title=Creation Date
 SummaryViewer_Device_Account_Description=This account was referenced by a device in the case.
+SummaryViewer_Fetching_References=<Fetching File References>
 SummaryViewer_FileRef_Message=<Select a single account to see File References>
 SummaryViewer_FileRefNameColumn_Title=Path
 SummaryViewer_TabTitle=Summary
 ThreadRootMessagePanel.showAllCheckBox.text=Show All Messages
 ThreadPane.backButton.text=<---
 SummaryViewer.caseReferencesPanel.border.title=Other Occurrences
-SummaryViewer.fileReferencesPanel.border.title=File References in Current Case
 MessageViewer.threadsLabel.text=Select a Thread to View
 MessageViewer.threadNameLabel.text=<threadName>
 MessageViewer.showingMessagesLabel.text=Showing Messages for Thread:
@@ -73,3 +73,5 @@ SummaryViewer.referencesLabel.text=Communication References:
 SummaryViewer.referencesDataLabel.text=<reference count>
 SummaryViewer.contactsLabel.text=Book Entries:
 SummaryViewer.accountCountry.text=<account country>
+SummaryViewer.fileRefPane.border.title=File Referernce(s) in Current Case
+SummaryViewer.selectAccountFileRefLabel.text=<Select a single account to see File References>

View File

@@ -49,7 +49,6 @@ SummeryViewer_FileRef_Message=<\u30a2\u30ab\u30a6\u30f3\u30c8\u30921\u3064\u9078
 ThreadRootMessagePanel.showAllCheckBox.text=\u3059\u3079\u3066\u306e\u30e1\u30c3\u30bb\u30fc\u30b8\u3092\u8868\u793a
 ThreadPane.backButton.text=<---
 SummaryViewer.caseReferencesPanel.border.title=\u305d\u306e\u4ed6\u306e\u767a\u751f
-SummaryViewer.fileReferencesPanel.border.title=\u73fe\u5728\u306e\u30b1\u30fc\u30b9\u306e\u30d5\u30a1\u30a4\u30eb\u30ec\u30d5\u30a1\u30ec\u30f3\u30b9
 MessageViewer.threadsLabel.text=\u30b9\u30ec\u30c3\u30c9\u3092\u9078\u629e\u3057\u3066\u8868\u793a
 MessageViewer.threadNameLabel.text=<threadName>
 MessageViewer.showingMessagesLabel.text=\u6b21\u306e\u30b9\u30ec\u30c3\u30c9\u306e\u30e1\u30c3\u30bb\u30fc\u30b8\u3092\u8868\u793a\u4e2d\u3067\u3059:

View File

@@ -18,7 +18,9 @@
  */
 package org.sleuthkit.autopsy.communications.relationships;

+import java.awt.event.ActionEvent;
 import java.util.logging.Level;
+import javax.swing.AbstractAction;
 import javax.swing.Action;
 import org.apache.commons.lang3.StringUtils;
 import org.openide.nodes.Sheet;
@@ -55,6 +57,8 @@ class MessageNode extends BlackboardArtifactNode {

     private final Action preferredAction;

+    private final Action defaultNoopAction = new DefaultMessageAction();
+
     MessageNode(BlackboardArtifact artifact, String threadID, Action preferredAction) {
         super(artifact);
@@ -148,7 +152,7 @@
     @Override
     public Action getPreferredAction() {
-        return preferredAction;
+        return preferredAction != null ? preferredAction : defaultNoopAction;
     }

     private int getAttachmentsCount() throws TskCoreException {
@@ -171,4 +175,17 @@
         return attachmentsCount;
     }
+
+    /**
+     * A no op action to override the default action of BlackboardArtifactNode
+     */
+    private class DefaultMessageAction extends AbstractAction {
+        private static final long serialVersionUID = 1L;
+
+        @Override
+        public void actionPerformed(ActionEvent e) {
+            // Do Nothing.
+        }
+    }
 }

View File

@@ -258,8 +258,6 @@ final class MessageViewer extends JPanel implements RelationshipsViewer {
      */
     private void showMessagesPane() {
         switchCard("messages");
-        Outline outline = rootTablePane.getOutlineView().getOutline();
-        outline.clearSelection();
     }

     /**

View File

@@ -23,6 +23,8 @@ import java.awt.KeyboardFocusManager;
 import java.beans.PropertyChangeEvent;
 import java.beans.PropertyChangeListener;
 import static javax.swing.SwingUtilities.isDescendingFrom;
+import javax.swing.event.TableModelEvent;
+import javax.swing.event.TableModelListener;
 import org.netbeans.swing.outline.DefaultOutlineModel;
 import org.netbeans.swing.outline.Outline;
 import org.openide.explorer.ExplorerManager;
@@ -82,6 +84,18 @@ class MessagesPanel extends javax.swing.JPanel implements Lookup.Provider {
                 } else {
                     messageContentViewer.setNode(null);
                 }
+            }
+        });
+
+        // This is a trick to get the first message to be selected after the ChildFactory has added
+        // new data to the table.
+        outlineViewPanel.getOutlineView().getOutline().getOutlineModel().addTableModelListener(new TableModelListener() {
+            @Override
+            public void tableChanged(TableModelEvent e) {
+                if (e.getType() == TableModelEvent.INSERT) {
+                    outline.setRowSelectionInterval(0, 0);
+                }
             }
         });

View File

@@ -253,22 +253,6 @@
 </Component>
 </SubComponents>
 </Container>
-<Component class="org.sleuthkit.autopsy.communications.relationships.OutlineViewPanel" name="fileReferencesPanel">
-<Properties>
-<Property name="border" type="javax.swing.border.Border" editor="org.netbeans.modules.form.editors2.BorderEditor">
-<Border info="org.netbeans.modules.form.compat2.border.TitledBorderInfo">
-<TitledBorder title="File References in Current Case">
-<ResourceString PropertyName="titleX" bundle="org/sleuthkit/autopsy/communications/relationships/Bundle.properties" key="SummaryViewer.fileReferencesPanel.border.title" replaceFormat="org.openide.util.NbBundle.getMessage({sourceFileName}.class, &quot;{key}&quot;)"/>
-</TitledBorder>
-</Border>
-</Property>
-</Properties>
-<Constraints>
-<Constraint layoutClass="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout" value="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout$GridBagConstraintsDescription">
-<GridBagConstraints gridX="0" gridY="3" gridWidth="1" gridHeight="1" fill="1" ipadX="0" ipadY="0" insetsTop="9" insetsLeft="0" insetsBottom="0" insetsRight="0" anchor="18" weightX="1.0" weightY="1.0"/>
-</Constraint>
-</Constraints>
-</Component>
 <Component class="org.sleuthkit.autopsy.communications.relationships.OutlineViewPanel" name="caseReferencesPanel">
 <Properties>
 <Property name="border" type="javax.swing.border.Border" editor="org.netbeans.modules.form.editors2.BorderEditor">
@@ -285,5 +269,99 @@
 </Constraint>
 </Constraints>
 </Component>
+<Container class="javax.swing.JPanel" name="fileRefPane">
+<Properties>
+<Property name="border" type="javax.swing.border.Border" editor="org.netbeans.modules.form.editors2.BorderEditor">
+<Border info="org.netbeans.modules.form.compat2.border.TitledBorderInfo">
+<TitledBorder title="File References in Current Case">
+<ResourceString PropertyName="titleX" bundle="org/sleuthkit/autopsy/communications/relationships/Bundle.properties" key="SummaryViewer.fileRefPane.border.title" replaceFormat="org.openide.util.NbBundle.getMessage({sourceFileName}.class, &quot;{key}&quot;)"/>
+</TitledBorder>
+</Border>
+</Property>
+</Properties>
+<Constraints>
+<Constraint layoutClass="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout" value="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout$GridBagConstraintsDescription">
+<GridBagConstraints gridX="0" gridY="3" gridWidth="1" gridHeight="1" fill="1" ipadX="0" ipadY="0" insetsTop="0" insetsLeft="0" insetsBottom="0" insetsRight="0" anchor="18" weightX="0.0" weightY="1.0"/>
+</Constraint>
+</Constraints>
+<Layout class="org.netbeans.modules.form.compat2.layouts.DesignCardLayout"/>
+<SubComponents>
+<Container class="javax.swing.JPanel" name="fileRefScrolPanel">
+<AuxValues>
+<AuxValue name="JavaCodeGenerator_VariableLocal" type="java.lang.Boolean" value="true"/>
+<AuxValue name="JavaCodeGenerator_VariableModifier" type="java.lang.Integer" value="0"/>
+</AuxValues>
+<Constraints>
+<Constraint layoutClass="org.netbeans.modules.form.compat2.layouts.DesignCardLayout" value="org.netbeans.modules.form.compat2.layouts.DesignCardLayout$CardConstraintsDescription">
+<CardConstraints cardName="listPanelCard"/>
+</Constraint>
+</Constraints>
+<Layout class="org.netbeans.modules.form.compat2.layouts.DesignBorderLayout"/>
+<SubComponents>
+<Container class="javax.swing.JScrollPane" name="scrollPane">
+<AuxValues>
+<AuxValue name="JavaCodeGenerator_VariableLocal" type="java.lang.Boolean" value="true"/>
+<AuxValue name="JavaCodeGenerator_VariableModifier" type="java.lang.Integer" value="0"/>
+</AuxValues>
+<Constraints>
+<Constraint layoutClass="org.netbeans.modules.form.compat2.layouts.DesignBorderLayout" value="org.netbeans.modules.form.compat2.layouts.DesignBorderLayout$BorderConstraintsDescription">
+<BorderConstraints direction="Center"/>
+</Constraint>
+</Constraints>
+<Layout class="org.netbeans.modules.form.compat2.layouts.support.JScrollPaneSupportLayout"/>
+<SubComponents>
+<Component class="javax.swing.JList" name="fileRefList">
+<Properties>
+<Property name="model" type="javax.swing.ListModel" editor="org.netbeans.modules.form.editors2.ListModelEditor">
+<StringArray count="5">
+<StringItem index="0" value="Item 1"/>
+<StringItem index="1" value="Item 2"/>
+<StringItem index="2" value="Item 3"/>
+<StringItem index="3" value="Item 4"/>
+<StringItem index="4" value="Item 5"/>
+</StringArray>
+</Property>
+</Properties>
+<AuxValues>
+<AuxValue name="JavaCodeGenerator_TypeParameters" type="java.lang.String" value="&lt;String&gt;"/>
+</AuxValues>
+</Component>
+</SubComponents>
+</Container>
+</SubComponents>
+</Container>
+<Container class="javax.swing.JPanel" name="selectAccountPane">
+<AuxValues>
+<AuxValue name="JavaCodeGenerator_VariableLocal" type="java.lang.Boolean" value="true"/>
+<AuxValue name="JavaCodeGenerator_VariableModifier" type="java.lang.Integer" value="0"/>
+</AuxValues>
+<Constraints>
+<Constraint layoutClass="org.netbeans.modules.form.compat2.layouts.DesignCardLayout" value="org.netbeans.modules.form.compat2.layouts.DesignCardLayout$CardConstraintsDescription">
+<CardConstraints cardName="selectAccountCard"/>
+</Constraint>
+</Constraints>
+<Layout class="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout"/>
+<SubComponents>
+<Component class="javax.swing.JLabel" name="selectAccountFileRefLabel">
+<Properties>
+<Property name="text" type="java.lang.String" editor="org.netbeans.modules.i18n.form.FormI18nStringEditor">
+<ResourceString bundle="org/sleuthkit/autopsy/communications/relationships/Bundle.properties" key="SummaryViewer.selectAccountFileRefLabel.text" replaceFormat="org.openide.util.NbBundle.getMessage({sourceFileName}.class, &quot;{key}&quot;)"/>
+</Property>
+<Property name="enabled" type="boolean" value="false"/>
+</Properties>
+<Constraints>
+<Constraint layoutClass="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout" value="org.netbeans.modules.form.compat2.layouts.DesignGridBagLayout$GridBagConstraintsDescription">
+<GridBagConstraints gridX="-1" gridY="-1" gridWidth="1" gridHeight="1" fill="0" ipadX="0" ipadY="0" insetsTop="0" insetsLeft="0" insetsBottom="0" insetsRight="0" anchor="10" weightX="0.0" weightY="0.0"/>
+</Constraint>
+</Constraints>
+</Component>
+</SubComponents>
+</Container>
+</SubComponents>
+</Container>
 </SubComponents>
 </Form>

View File

@ -18,8 +18,14 @@
*/ */
package org.sleuthkit.autopsy.communications.relationships; package org.sleuthkit.autopsy.communications.relationships;
import java.util.Set; import java.awt.CardLayout;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.logging.Level;
import javax.swing.DefaultListModel;
import javax.swing.JPanel; import javax.swing.JPanel;
import javax.swing.SwingWorker;
import org.netbeans.swing.outline.DefaultOutlineModel; import org.netbeans.swing.outline.DefaultOutlineModel;
import org.netbeans.swing.outline.Outline; import org.netbeans.swing.outline.Outline;
import org.openide.explorer.view.OutlineView; import org.openide.explorer.view.OutlineView;
@ -27,8 +33,11 @@ import org.openide.nodes.AbstractNode;
import org.openide.nodes.Children; import org.openide.nodes.Children;
import org.openide.util.Lookup; import org.openide.util.Lookup;
import org.openide.util.NbBundle.Messages; import org.openide.util.NbBundle.Messages;
import org.sleuthkit.autopsy.casemodule.Case;
import org.sleuthkit.datamodel.Account; import org.sleuthkit.datamodel.Account;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository; import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository;
import org.sleuthkit.autopsy.coreutils.Logger;
import org.sleuthkit.datamodel.AccountFileInstance;
/** /**
* Account Summary View Panel. This panel shows a list of various counts related * Account Summary View Panel. This panel shows a list of various counts related
@ -39,6 +48,9 @@ import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository;
public class SummaryViewer extends javax.swing.JPanel implements RelationshipsViewer { public class SummaryViewer extends javax.swing.JPanel implements RelationshipsViewer {
private final Lookup lookup; private final Lookup lookup;
private final DefaultListModel<String> fileRefListModel;
private static final Logger logger = Logger.getLogger(SummaryViewer.class.getName());
@Messages({ @Messages({
"SummaryViewer_TabTitle=Summary", "SummaryViewer_TabTitle=Summary",
@ -60,14 +72,11 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
lookup = Lookup.getDefault(); lookup = Lookup.getDefault();
initComponents(); initComponents();
OutlineView outlineView = fileReferencesPanel.getOutlineView(); fileRefListModel = new DefaultListModel<>();
fileRefList.setModel(fileRefListModel);
OutlineView outlineView = caseReferencesPanel.getOutlineView();
Outline outline = outlineView.getOutline(); Outline outline = outlineView.getOutline();
outline.setRootVisible(false);
((DefaultOutlineModel) outline.getOutlineModel()).setNodesColumnLabel(Bundle.SummaryViewer_FileRefNameColumn_Title());
outlineView = caseReferencesPanel.getOutlineView();
outline = outlineView.getOutline();
outlineView.setPropertyColumns("creationDate", Bundle.SummaryViewer_Creation_Date_Title()); //NON-NLS outlineView.setPropertyColumns("creationDate", Bundle.SummaryViewer_Creation_Date_Title()); //NON-NLS
outline.setRootVisible(false); outline.setRootVisible(false);
@ -76,7 +85,6 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
clearControls(); clearControls();
caseReferencesPanel.hideOutlineView(Bundle.SummaryViewer_CentralRepository_Message()); caseReferencesPanel.hideOutlineView(Bundle.SummaryViewer_CentralRepository_Message());
fileReferencesPanel.hideOutlineView(Bundle.SummaryViewer_FileRef_Message());
} }
@Override @Override
@ -98,15 +106,20 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
caseReferencesPanel.showOutlineView(); caseReferencesPanel.showOutlineView();
} }
CardLayout cardLayout = (CardLayout) fileRefPane.getLayout();
cardLayout.show(fileRefPane, "selectAccountCard");
fileRefListModel.removeAllElements();
// Request is that the SummaryViewer only show information if one // Request is that the SummaryViewer only show information if one
// account is selected // account is selected
if (info.getAccounts().size() != 1) { if (info == null || info.getAccounts().size() != 1) {
setEnabled(false); setEnabled(false);
clearControls(); clearControls();
accoutDescriptionLabel.setText(Bundle.SummaryViewer_Account_Description_MuliSelect()); accoutDescriptionLabel.setText(Bundle.SummaryViewer_Account_Description_MuliSelect());
selectAccountFileRefLabel.setText(Bundle.SummaryViewer_FileRef_Message());
fileReferencesPanel.hideOutlineView(Bundle.SummaryViewer_FileRef_Message());
} else { } else {
Account[] accountArray = info.getAccounts().toArray(new Account[1]); Account[] accountArray = info.getAccounts().toArray(new Account[1]);
Account account = accountArray[0]; Account account = accountArray[0];
@ -138,11 +151,10 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
referencesDataLabel.setText(Integer.toString(summaryDetails.getReferenceCnt())); referencesDataLabel.setText(Integer.toString(summaryDetails.getReferenceCnt()));
contactsDataLabel.setText(Integer.toString(summaryDetails.getContactsCnt())); contactsDataLabel.setText(Integer.toString(summaryDetails.getContactsCnt()));
fileReferencesPanel.showOutlineView();
fileReferencesPanel.setNode(new AbstractNode(Children.create(new AccountSourceContentChildNodeFactory(info.getAccounts()), true)));
caseReferencesPanel.setNode(new AbstractNode(Children.create(new CorrelationCaseChildNodeFactory(info.getAccounts()), true))); caseReferencesPanel.setNode(new AbstractNode(Children.create(new CorrelationCaseChildNodeFactory(info.getAccounts()), true)));
updateFileReferences(account);
setEnabled(true); setEnabled(true);
} }
} }
@ -165,7 +177,7 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
contactsLabel.setEnabled(enabled); contactsLabel.setEnabled(enabled);
messagesLabel.setEnabled(enabled); messagesLabel.setEnabled(enabled);
caseReferencesPanel.setEnabled(enabled); caseReferencesPanel.setEnabled(enabled);
fileReferencesPanel.setEnabled(enabled); fileRefList.setEnabled(enabled);
countsPanel.setEnabled(enabled); countsPanel.setEnabled(enabled);
attachmentsLabel.setEnabled(enabled); attachmentsLabel.setEnabled(enabled);
referencesLabel.setEnabled(enabled); referencesLabel.setEnabled(enabled);
@ -185,28 +197,45 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
referencesDataLabel.setText(""); referencesDataLabel.setText("");
accountCountry.setText(""); accountCountry.setText("");
fileReferencesPanel.setNode(new AbstractNode(Children.LEAF)); fileRefListModel.clear();
caseReferencesPanel.setNode(new AbstractNode(Children.LEAF)); caseReferencesPanel.setNode(new AbstractNode(Children.LEAF));
} }
    /**
     * For the given accounts create a comma separated string of all of the
     * names (TypeSpecificID).
     *
     * @param accounts Set of selected accounts
     *
     * @return String listing the account names
     */
    private String createAccountLabel(Set<Account> accounts) {
        StringBuilder buffer = new StringBuilder();
        accounts.stream().map((account) -> {
            buffer.append(account.getTypeSpecificID());
            return account;
        }).forEachOrdered((_item) -> {
            buffer.append(", ");
        });
        return buffer.toString().substring(0, buffer.length() - 2);
    }

    @Messages({
        "SummaryViewer_Fetching_References=<Fetching File References>"
    })
    private void updateFileReferences(final Account account) {
        SwingWorker<List<String>, Void> worker = new SwingWorker<List<String>, Void>() {
            @Override
            protected List<String> doInBackground() throws Exception {
                List<String> stringList = new ArrayList<>();
                List<AccountFileInstance> accountFileInstanceList = Case.getCurrentCase().getSleuthkitCase().getCommunicationsManager().getAccountFileInstances(account);
                for (AccountFileInstance instance : accountFileInstanceList) {
                    stringList.add(instance.getFile().getUniquePath());
                }
                return stringList;
            }

            @Override
            protected void done() {
                try {
                    List<String> fileRefList = get();
                    fileRefList.forEach(value -> {
                        fileRefListModel.addElement(value);
                    });

                    CardLayout cardLayout = (CardLayout) fileRefPane.getLayout();
                    cardLayout.show(fileRefPane, "listPanelCard");
                } catch (InterruptedException | ExecutionException ex) {
                    logger.log(Level.WARNING, String.format(("Failed to get file references for account: %d"), account.getAccountID()), ex);
                }
            }
        };

        selectAccountFileRefLabel.setText(Bundle.SummaryViewer_Fetching_References());
        worker.execute();
    }
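The SwingWorker-based updateFileReferences above replaces the old createAccountLabel helper: the file-reference lookup now runs off the Event Dispatch Thread, and the Swing list model is only touched in done(). A minimal, self-contained sketch of that same pattern; the class, model, and sample paths below are illustrative stand-ins, not Autopsy APIs:

import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutionException;
import javax.swing.DefaultListModel;
import javax.swing.SwingUtilities;
import javax.swing.SwingWorker;

public class BackgroundListLoadSketch {

    // Model that a JList would display; populated only on the EDT in done().
    private final DefaultListModel<String> listModel = new DefaultListModel<>();

    void loadReferences() {
        new SwingWorker<List<String>, Void>() {
            @Override
            protected List<String> doInBackground() {
                // Stand-in for the slow lookup (e.g. a database query).
                return Arrays.asList("/img_1/vol_2/contacts.db", "/img_1/vol_2/calllog.db");
            }

            @Override
            protected void done() {
                try {
                    // done() runs on the EDT, so it is safe to update Swing state here.
                    get().forEach(listModel::addElement);
                } catch (InterruptedException | ExecutionException ex) {
                    ex.printStackTrace();
                }
            }
        }.execute();
    }

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> new BackgroundListLoadSketch().loadReferences());
    }
}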
/** /**
@ -237,8 +266,13 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
contactsDataLabel = new javax.swing.JLabel(); contactsDataLabel = new javax.swing.JLabel();
referencesLabel = new javax.swing.JLabel(); referencesLabel = new javax.swing.JLabel();
referencesDataLabel = new javax.swing.JLabel(); referencesDataLabel = new javax.swing.JLabel();
fileReferencesPanel = new org.sleuthkit.autopsy.communications.relationships.OutlineViewPanel();
caseReferencesPanel = new org.sleuthkit.autopsy.communications.relationships.OutlineViewPanel(); caseReferencesPanel = new org.sleuthkit.autopsy.communications.relationships.OutlineViewPanel();
fileRefPane = new javax.swing.JPanel();
javax.swing.JPanel fileRefScrolPanel = new javax.swing.JPanel();
javax.swing.JScrollPane scrollPane = new javax.swing.JScrollPane();
fileRefList = new javax.swing.JList<>();
javax.swing.JPanel selectAccountPane = new javax.swing.JPanel();
selectAccountFileRefLabel = new javax.swing.JLabel();
setLayout(new java.awt.GridBagLayout()); setLayout(new java.awt.GridBagLayout());
@ -393,17 +427,6 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
gridBagConstraints.anchor = java.awt.GridBagConstraints.NORTHWEST; gridBagConstraints.anchor = java.awt.GridBagConstraints.NORTHWEST;
add(contanctsPanel, gridBagConstraints); add(contanctsPanel, gridBagConstraints);
fileReferencesPanel.setBorder(javax.swing.BorderFactory.createTitledBorder(org.openide.util.NbBundle.getMessage(SummaryViewer.class, "SummaryViewer.fileReferencesPanel.border.title"))); // NOI18N
gridBagConstraints = new java.awt.GridBagConstraints();
gridBagConstraints.gridx = 0;
gridBagConstraints.gridy = 3;
gridBagConstraints.fill = java.awt.GridBagConstraints.BOTH;
gridBagConstraints.anchor = java.awt.GridBagConstraints.NORTHWEST;
gridBagConstraints.weightx = 1.0;
gridBagConstraints.weighty = 1.0;
gridBagConstraints.insets = new java.awt.Insets(9, 0, 0, 0);
add(fileReferencesPanel, gridBagConstraints);
caseReferencesPanel.setBorder(javax.swing.BorderFactory.createTitledBorder(org.openide.util.NbBundle.getMessage(SummaryViewer.class, "SummaryViewer.caseReferencesPanel.border.title"))); // NOI18N caseReferencesPanel.setBorder(javax.swing.BorderFactory.createTitledBorder(org.openide.util.NbBundle.getMessage(SummaryViewer.class, "SummaryViewer.caseReferencesPanel.border.title"))); // NOI18N
gridBagConstraints = new java.awt.GridBagConstraints(); gridBagConstraints = new java.awt.GridBagConstraints();
gridBagConstraints.gridx = 0; gridBagConstraints.gridx = 0;
@ -414,6 +437,38 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
gridBagConstraints.weighty = 1.0; gridBagConstraints.weighty = 1.0;
gridBagConstraints.insets = new java.awt.Insets(9, 0, 0, 0); gridBagConstraints.insets = new java.awt.Insets(9, 0, 0, 0);
add(caseReferencesPanel, gridBagConstraints); add(caseReferencesPanel, gridBagConstraints);
fileRefPane.setBorder(javax.swing.BorderFactory.createTitledBorder(org.openide.util.NbBundle.getMessage(SummaryViewer.class, "SummaryViewer.fileRefPane.border.title"))); // NOI18N
fileRefPane.setLayout(new java.awt.CardLayout());
fileRefScrolPanel.setLayout(new java.awt.BorderLayout());
fileRefList.setModel(new javax.swing.AbstractListModel<String>() {
String[] strings = { "Item 1", "Item 2", "Item 3", "Item 4", "Item 5" };
public int getSize() { return strings.length; }
public String getElementAt(int i) { return strings[i]; }
});
scrollPane.setViewportView(fileRefList);
fileRefScrolPanel.add(scrollPane, java.awt.BorderLayout.CENTER);
fileRefPane.add(fileRefScrolPanel, "listPanelCard");
selectAccountPane.setLayout(new java.awt.GridBagLayout());
org.openide.awt.Mnemonics.setLocalizedText(selectAccountFileRefLabel, org.openide.util.NbBundle.getMessage(SummaryViewer.class, "SummaryViewer.selectAccountFileRefLabel.text")); // NOI18N
selectAccountFileRefLabel.setEnabled(false);
selectAccountPane.add(selectAccountFileRefLabel, new java.awt.GridBagConstraints());
fileRefPane.add(selectAccountPane, "selectAccountCard");
gridBagConstraints = new java.awt.GridBagConstraints();
gridBagConstraints.gridx = 0;
gridBagConstraints.gridy = 3;
gridBagConstraints.fill = java.awt.GridBagConstraints.BOTH;
gridBagConstraints.anchor = java.awt.GridBagConstraints.NORTHWEST;
gridBagConstraints.weighty = 1.0;
add(fileRefPane, gridBagConstraints);
}// </editor-fold>//GEN-END:initComponents }// </editor-fold>//GEN-END:initComponents
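fileRefPane now uses a CardLayout with two cards: "listPanelCard" (the scrollable file-reference list) and "selectAccountCard" (the placeholder label), and updateFileReferences flips between them with CardLayout.show. A minimal, hedged sketch of that card-switching idea; the frame and sample strings are illustrative only:

import java.awt.CardLayout;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JList;
import javax.swing.JPanel;
import javax.swing.JScrollPane;
import javax.swing.SwingUtilities;

public class CardSwitchSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JPanel cards = new JPanel(new CardLayout());
            cards.add(new JScrollPane(new JList<>(new String[]{"ref 1", "ref 2"})), "listPanelCard");
            cards.add(new JLabel("<Select a single account to see file references>"), "selectAccountCard");

            // Show the placeholder first, then switch to the list once data is ready.
            CardLayout layout = (CardLayout) cards.getLayout();
            layout.show(cards, "selectAccountCard");
            layout.show(cards, "listPanelCard");

            JFrame frame = new JFrame("CardLayout sketch");
            frame.add(cards);
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.pack();
            frame.setVisible(true);
        });
    }
}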
@ -430,11 +485,13 @@ public class SummaryViewer extends javax.swing.JPanel implements RelationshipsVi
private javax.swing.JLabel contactsLabel; private javax.swing.JLabel contactsLabel;
private javax.swing.JPanel contanctsPanel; private javax.swing.JPanel contanctsPanel;
private javax.swing.JPanel countsPanel; private javax.swing.JPanel countsPanel;
private org.sleuthkit.autopsy.communications.relationships.OutlineViewPanel fileReferencesPanel; private javax.swing.JList<String> fileRefList;
private javax.swing.JPanel fileRefPane;
private javax.swing.JLabel messagesDataLabel; private javax.swing.JLabel messagesDataLabel;
private javax.swing.JLabel messagesLabel; private javax.swing.JLabel messagesLabel;
private javax.swing.JLabel referencesDataLabel; private javax.swing.JLabel referencesDataLabel;
private javax.swing.JLabel referencesLabel; private javax.swing.JLabel referencesLabel;
private javax.swing.JLabel selectAccountFileRefLabel;
private javax.swing.JPanel summaryPanel; private javax.swing.JPanel summaryPanel;
private javax.swing.JLabel thumbnailCntLabel; private javax.swing.JLabel thumbnailCntLabel;
private javax.swing.JLabel thumbnailsDataLabel; private javax.swing.JLabel thumbnailsDataLabel;
@ -633,21 +633,18 @@ public class ContactArtifactViewer extends javax.swing.JPanel implements Artifac
return new HashMap<>(); return new HashMap<>();
} }
// make a list of all unique accounts for this contact
if (!account.getAccountType().equals(Account.Type.DEVICE)) {
CentralRepoAccount.CentralRepoAccountType crAccountType = CentralRepository.getInstance().getAccountTypeByName(account.getAccountType().getTypeName());
CentralRepoAccount crAccount = CentralRepository.getInstance().getAccount(crAccountType, account.getTypeSpecificID());
if (crAccount != null && uniqueAccountsList.contains(crAccount) == false) {
uniqueAccountsList.add(crAccount);
}
}
Collection<PersonaAccount> personaAccounts = PersonaAccount.getPersonaAccountsForAccount(account); Collection<PersonaAccount> personaAccounts = PersonaAccount.getPersonaAccountsForAccount(account);
if (personaAccounts != null && !personaAccounts.isEmpty()) { if (personaAccounts != null && !personaAccounts.isEmpty()) {
// look for unique accounts
Collection<CentralRepoAccount> accountCandidates
= personaAccounts
.stream()
.map(PersonaAccount::getAccount)
.collect(Collectors.toList());
for (CentralRepoAccount crAccount : accountCandidates) {
if (uniqueAccountsList.contains(crAccount) == false) {
uniqueAccountsList.add(crAccount);
}
}
// get personas for the account // get personas for the account
Collection<Persona> personas Collection<Persona> personas
= personaAccounts = personaAccounts
@ -290,6 +290,8 @@ public final class ContextViewer extends javax.swing.JPanel implements DataConte
contextContainer.add(usagePanel); contextContainer.add(usagePanel);
} }
} }
contextContainer.setBackground(javax.swing.UIManager.getDefaults().getColor("window"));
contextContainer.setEnabled(foundASource); contextContainer.setEnabled(foundASource);
contextContainer.setVisible(foundASource); contextContainer.setVisible(foundASource);
jScrollPane.getViewport().setView(contextContainer); jScrollPane.getViewport().setView(contextContainer);
@ -87,12 +87,28 @@ public class EncodingUtils {
try (InputStream stream = new BufferedInputStream(new ReadContentInputStream(file))) { try (InputStream stream = new BufferedInputStream(new ReadContentInputStream(file))) {
CharsetDetector detector = new CharsetDetector(); CharsetDetector detector = new CharsetDetector();
detector.setText(stream); detector.setText(stream);
CharsetMatch tikaResult = detector.detect();
if (tikaResult != null && tikaResult.getConfidence() >= MIN_CHARSETDETECT_MATCH_CONFIDENCE) { CharsetMatch[] tikaResults = detector.detectAll();
String tikaCharSet = tikaResult.getName(); // Get all guesses by Tika. These matches are ordered
//Check if the nio package has support for the charset determined by Tika. // by descending confidence (largest first).
if(Charset.isSupported(tikaCharSet)) { if (tikaResults.length > 0) {
return Charset.forName(tikaCharSet); CharsetMatch topPick = tikaResults[0];
if (topPick.getName().equalsIgnoreCase("IBM500") && tikaResults.length > 1) {
// Legacy encoding, let's discard this one in favor
// of the second pick. Tika has some problems with
// mistakenly identifying text as IBM500. See JIRA-6600
// and https://issues.apache.org/jira/browse/TIKA-2771 for
// more details.
topPick = tikaResults[1];
}
if (!topPick.getName().equalsIgnoreCase("IBM500") &&
topPick.getConfidence() >= MIN_CHARSETDETECT_MATCH_CONFIDENCE &&
Charset.isSupported(topPick.getName())) {
// Choose this charset since it's supported and has high
// enough confidence
return Charset.forName(topPick.getName());
} }
} }
} }
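The reworked detection takes every candidate from detectAll() (ordered best-first), demotes a top-ranked IBM500 guess in favor of the runner-up, and accepts a match only if it clears the confidence threshold and is supported by java.nio. A hedged sketch of that selection logic, assuming the ICU-style detector that Tika bundles (org.apache.tika.parser.txt.CharsetDetector) and an illustrative threshold of 35 rather than the module's actual constant:

import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;
import org.apache.tika.parser.txt.CharsetDetector;
import org.apache.tika.parser.txt.CharsetMatch;

public class CharsetPickSketch {

    private static final int MIN_CONFIDENCE = 35; // illustrative threshold, not Autopsy's constant

    static Charset pickCharset(byte[] sample) {
        CharsetDetector detector = new CharsetDetector();
        detector.setText(sample);
        CharsetMatch[] results = detector.detectAll(); // ordered by descending confidence
        if (results.length == 0) {
            return null;
        }
        CharsetMatch top = results[0];
        if (top.getName().equalsIgnoreCase("IBM500") && results.length > 1) {
            top = results[1]; // distrust IBM500, a common false positive
        }
        if (!top.getName().equalsIgnoreCase("IBM500")
                && top.getConfidence() >= MIN_CONFIDENCE
                && Charset.isSupported(top.getName())) {
            return Charset.forName(top.getName());
        }
        return null;
    }

    public static void main(String[] args) {
        byte[] sample = "plain ascii text for the detector".getBytes(StandardCharsets.US_ASCII);
        System.out.println(pickCharset(sample));
    }
}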
@ -186,7 +186,7 @@ FileSorter.SortingMethod.filetype.displayName=File Type
FileSorter.SortingMethod.frequency.displayName=Central Repo Frequency FileSorter.SortingMethod.frequency.displayName=Central Repo Frequency
FileSorter.SortingMethod.fullPath.displayName=Full Path FileSorter.SortingMethod.fullPath.displayName=Full Path
FileSorter.SortingMethod.keywordlist.displayName=Keyword List Names FileSorter.SortingMethod.keywordlist.displayName=Keyword List Names
GroupsListPanel.noResults.message.text=No results were found for the selected filters. GroupsListPanel.noResults.message.text=No results were found for the selected filters.\n\nReminder:\n -The File Type Identification module must be run on each data source you want to find results in.\n -The Hash Lookup module must be run on each data source if you want to filter by past occurrence.\n -The Exif module must be run on each data source if you are filtering by User Created content.
GroupsListPanel.noResults.title.text=No results found GroupsListPanel.noResults.title.text=No results found
ImageThumbnailPanel.isDeleted.text=All instances of file are deleted. ImageThumbnailPanel.isDeleted.text=All instances of file are deleted.
# {0} - otherInstanceCount # {0} - otherInstanceCount
@ -23,6 +23,7 @@ import java.awt.Color;
import java.beans.PropertyChangeEvent; import java.beans.PropertyChangeEvent;
import java.beans.PropertyChangeListener; import java.beans.PropertyChangeListener;
import java.util.EnumSet; import java.util.EnumSet;
import java.util.HashSet;
import java.util.List; import java.util.List;
import java.util.Set; import java.util.Set;
import java.util.logging.Level; import java.util.logging.Level;
@ -30,12 +31,21 @@ import org.apache.commons.lang.StringUtils;
import org.openide.util.NbBundle.Messages; import org.openide.util.NbBundle.Messages;
import org.openide.windows.WindowManager; import org.openide.windows.WindowManager;
import org.sleuthkit.autopsy.casemodule.Case; import org.sleuthkit.autopsy.casemodule.Case;
import org.sleuthkit.autopsy.casemodule.NoCurrentCaseException;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoException; import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoException;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository; import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepository;
import org.sleuthkit.autopsy.coreutils.Logger; import org.sleuthkit.autopsy.coreutils.Logger;
import org.sleuthkit.autopsy.discovery.FileGroup.GroupSortingAlgorithm; import org.sleuthkit.autopsy.discovery.FileGroup.GroupSortingAlgorithm;
import static org.sleuthkit.autopsy.discovery.FileGroup.GroupSortingAlgorithm.BY_GROUP_SIZE;
import org.sleuthkit.autopsy.discovery.FileSearch.GroupingAttributeType; import org.sleuthkit.autopsy.discovery.FileSearch.GroupingAttributeType;
import static org.sleuthkit.autopsy.discovery.FileSearch.GroupingAttributeType.PARENT_PATH;
import org.sleuthkit.autopsy.discovery.FileSorter.SortingMethod; import org.sleuthkit.autopsy.discovery.FileSorter.SortingMethod;
import org.sleuthkit.autopsy.ingest.IngestManager;
import org.sleuthkit.autopsy.ingest.ModuleDataEvent;
import org.sleuthkit.datamodel.BlackboardArtifact;
import org.sleuthkit.datamodel.BlackboardAttribute;
import org.sleuthkit.datamodel.TskCoreException;
import static org.sleuthkit.autopsy.discovery.FileSorter.SortingMethod.BY_FILE_NAME;
/** /**
* Dialog for displaying the controls and filters for configuration of a * Dialog for displaying the controls and filters for configuration of a
@ -45,6 +55,7 @@ final class DiscoveryDialog extends javax.swing.JDialog {
private static final Set<Case.Events> CASE_EVENTS_OF_INTEREST = EnumSet.of(Case.Events.CURRENT_CASE, private static final Set<Case.Events> CASE_EVENTS_OF_INTEREST = EnumSet.of(Case.Events.CURRENT_CASE,
Case.Events.DATA_SOURCE_ADDED, Case.Events.DATA_SOURCE_DELETED); Case.Events.DATA_SOURCE_ADDED, Case.Events.DATA_SOURCE_DELETED);
private static final Set<IngestManager.IngestModuleEvent> INGEST_MODULE_EVENTS_OF_INTEREST = EnumSet.of(IngestManager.IngestModuleEvent.DATA_ADDED);
private static final long serialVersionUID = 1L; private static final long serialVersionUID = 1L;
private final static Logger logger = Logger.getLogger(DiscoveryDialog.class.getName()); private final static Logger logger = Logger.getLogger(DiscoveryDialog.class.getName());
private ImageFilterPanel imageFilterPanel = null; private ImageFilterPanel imageFilterPanel = null;
@ -54,8 +65,12 @@ final class DiscoveryDialog extends javax.swing.JDialog {
private static final Color UNSELECTED_COLOR = new Color(240, 240, 240); private static final Color UNSELECTED_COLOR = new Color(240, 240, 240);
private SearchWorker searchWorker = null; private SearchWorker searchWorker = null;
private static DiscoveryDialog discDialog; private static DiscoveryDialog discDialog;
private static volatile boolean shouldUpdate = false;
private FileSearchData.FileType fileType = FileSearchData.FileType.IMAGE; private FileSearchData.FileType fileType = FileSearchData.FileType.IMAGE;
private final PropertyChangeListener listener; private final PropertyChangeListener listener;
private final Set<BlackboardAttribute> objectsDetected = new HashSet<>();
private final Set<BlackboardAttribute> interestingItems = new HashSet<>();
private final Set<BlackboardAttribute> hashSets = new HashSet<>();
/** /**
* Get the Discovery dialog instance. * Get the Discovery dialog instance.
@ -66,6 +81,10 @@ final class DiscoveryDialog extends javax.swing.JDialog {
if (discDialog == null) { if (discDialog == null) {
discDialog = new DiscoveryDialog(); discDialog = new DiscoveryDialog();
} }
if (shouldUpdate) {
discDialog.updateSearchSettings();
shouldUpdate = false;
}
return discDialog; return discDialog;
} }
@ -89,6 +108,7 @@ final class DiscoveryDialog extends javax.swing.JDialog {
} }
updateSearchSettings(); updateSearchSettings();
Case.addEventTypeSubscriber(CASE_EVENTS_OF_INTEREST, this.new CasePropertyChangeListener()); Case.addEventTypeSubscriber(CASE_EVENTS_OF_INTEREST, this.new CasePropertyChangeListener());
IngestManager.getInstance().addIngestModuleEventListener(INGEST_MODULE_EVENTS_OF_INTEREST, this.new ModuleChangeListener());
} }
/** /**
@ -116,6 +136,7 @@ final class DiscoveryDialog extends javax.swing.JDialog {
add(imageFilterPanel, CENTER); add(imageFilterPanel, CENTER);
imageFilterPanel.addPropertyChangeListener(listener); imageFilterPanel.addPropertyChangeListener(listener);
updateComboBoxes(); updateComboBoxes();
groupSortingComboBox.setSelectedItem(BY_GROUP_SIZE);
pack(); pack();
repaint(); repaint();
} }
@ -129,6 +150,7 @@ final class DiscoveryDialog extends javax.swing.JDialog {
for (FileSearch.GroupingAttributeType type : FileSearch.GroupingAttributeType.getOptionsForGrouping()) { for (FileSearch.GroupingAttributeType type : FileSearch.GroupingAttributeType.getOptionsForGrouping()) {
addTypeToGroupByComboBox(type); addTypeToGroupByComboBox(type);
} }
groupByCombobox.setSelectedItem(PARENT_PATH);
orderByCombobox.removeAllItems(); orderByCombobox.removeAllItems();
// Set up the file order list // Set up the file order list
for (FileSorter.SortingMethod method : FileSorter.SortingMethod.getOptionsForOrdering()) { for (FileSorter.SortingMethod method : FileSorter.SortingMethod.getOptionsForOrdering()) {
@ -136,7 +158,7 @@ final class DiscoveryDialog extends javax.swing.JDialog {
orderByCombobox.addItem(method); orderByCombobox.addItem(method);
} }
} }
groupSortingComboBox.setSelectedIndex(0); orderByCombobox.setSelectedItem(BY_FILE_NAME);
} }
/** /**
@ -531,7 +553,8 @@ final class DiscoveryDialog extends javax.swing.JDialog {
* The adjust the controls to reflect whether the settings are valid based * The adjust the controls to reflect whether the settings are valid based
* on the error. * on the error.
* *
* @param error The error message to display, empty string if there is no error. * @param error The error message to display, empty string if there is no
* error.
*/ */
private void setValid(String error) { private void setValid(String error) {
if (StringUtils.isBlank(error)) { if (StringUtils.isBlank(error)) {
@ -575,7 +598,7 @@ final class DiscoveryDialog extends javax.swing.JDialog {
case DATA_SOURCE_ADDED: case DATA_SOURCE_ADDED:
//fallthrough //fallthrough
case DATA_SOURCE_DELETED: case DATA_SOURCE_DELETED:
updateSearchSettings(); shouldUpdate = true;
break; break;
default: default:
//do nothing if the event is not one of the above events. //do nothing if the event is not one of the above events.
@ -583,4 +606,83 @@ final class DiscoveryDialog extends javax.swing.JDialog {
} }
} }
} }
/**
* PropertyChangeListener to listen to ingest module events that may modify
* the filters available.
*/
private class ModuleChangeListener implements PropertyChangeListener {
@Override
@SuppressWarnings("fallthrough")
public void propertyChange(PropertyChangeEvent evt) {
if (!shouldUpdate) {
String eventType = evt.getPropertyName();
if (eventType.equals(IngestManager.IngestModuleEvent.DATA_ADDED.toString())) {
/**
* Checking for a current case is a stop gap measure until a
* different way of handling the closing of cases is worked
* out. Currently, remote events may be received for a case
* that is already closed.
*/
try {
Case.getCurrentCaseThrows();
/**
* Even with the check above, it is still possible that
* the case will be closed in a different thread before
* this code executes. If that happens, it is possible
* for the event to have a null oldValue.
*/
ModuleDataEvent eventData = (ModuleDataEvent) evt.getOldValue();
if (null != eventData) {
if (eventData.getBlackboardArtifactType().getTypeID() == BlackboardArtifact.ARTIFACT_TYPE.TSK_OBJECT_DETECTED.getTypeID() && eventData.getArtifacts() != null) {
shouldUpdate = shouldUpdateFilters(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_DESCRIPTION.getTypeID(), eventData, objectsDetected);
} else if (eventData.getBlackboardArtifactType().getTypeID() == BlackboardArtifact.ARTIFACT_TYPE.TSK_HASHSET_HIT.getTypeID()) {
shouldUpdate = shouldUpdateFilters(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_SET_NAME.getTypeID(), eventData, hashSets);
} else if (eventData.getBlackboardArtifactType().getTypeID() == BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_FILE_HIT.getTypeID()
|| eventData.getBlackboardArtifactType().getTypeID() == BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_ARTIFACT_HIT.getTypeID()) {
shouldUpdate = shouldUpdateFilters(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_SET_NAME.getTypeID(), eventData, interestingItems);
}
}
} catch (NoCurrentCaseException notUsed) {
// Case is closed, do nothing.
} catch (TskCoreException ex) {
logger.log(Level.WARNING, "Unable to determine if discovery UI should be updated", ex);
}
}
}
}
/**
* Helper method to determine if the artifact in the eventData
* represents a new value for the filter.
*
* @param attributeTypeId The attribute id of the attribute which
* contains the value for the filter.
* @param eventData The event which contains the artifacts.
* @param filterSetToCheck The set of current values for the relevant
* filter.
*
* @return True if the value is a new value for the filter, false
* otherwise.
*
* @throws TskCoreException Thrown because the attributes were unable to
* be retrieved for one of the artifacts in the
* eventData.
*/
private boolean shouldUpdateFilters(int attributeTypeId, ModuleDataEvent eventData, Set<BlackboardAttribute> filterSetToCheck) throws TskCoreException {
for (BlackboardArtifact artifact : eventData.getArtifacts()) {
if (artifact.getAttributes() != null) {
for (BlackboardAttribute attr : artifact.getAttributes()) {
if (attr.getAttributeType().getTypeID() == attributeTypeId && !filterSetToCheck.contains(attr)) {
filterSetToCheck.add(attr);
return true;
}
}
}
}
return false;
}
}
} }
@ -22,6 +22,8 @@ import com.google.common.eventbus.Subscribe;
import java.awt.BorderLayout; import java.awt.BorderLayout;
import java.awt.Color; import java.awt.Color;
import java.awt.Graphics; import java.awt.Graphics;
import java.beans.PropertyChangeEvent;
import java.beans.PropertyChangeListener;
import java.util.List; import java.util.List;
import java.util.stream.Collectors; import java.util.stream.Collectors;
import javax.swing.JSplitPane; import javax.swing.JSplitPane;
@ -47,14 +49,12 @@ public final class DiscoveryTopComponent extends TopComponent {
private static final long serialVersionUID = 1L; private static final long serialVersionUID = 1L;
private static final String PREFERRED_ID = "Discovery"; // NON-NLS private static final String PREFERRED_ID = "Discovery"; // NON-NLS
private static final int ANIMATION_INCREMENT = 30;
private volatile static int resultsAreaSize = 250;
private final GroupListPanel groupListPanel; private final GroupListPanel groupListPanel;
private final DetailsPanel detailsPanel; private final DetailsPanel detailsPanel;
private final ResultsPanel resultsPanel; private final ResultsPanel resultsPanel;
private int dividerLocation = -1; private int dividerLocation = -1;
private static final int ANIMATION_INCREMENT = 10;
private static final int RESULTS_AREA_SMALL_SIZE = 250;
private SwingAnimator animator = null; private SwingAnimator animator = null;
/** /**
@ -78,6 +78,19 @@ public final class DiscoveryTopComponent extends TopComponent {
} }
}); });
rightSplitPane.addPropertyChangeListener(JSplitPane.DIVIDER_LOCATION_PROPERTY, new PropertyChangeListener() {
@Override
public void propertyChange(PropertyChangeEvent evt) {
if (evt.getPropertyName().equalsIgnoreCase(JSplitPane.DIVIDER_LOCATION_PROPERTY)) {
//Only change the saved location when it was a manual change by the user and not the animation or the window opening initially
if ((animator == null || !animator.isRunning()) && evt.getNewValue() instanceof Integer
&& ((int) evt.getNewValue() + 5) < (rightSplitPane.getHeight() - rightSplitPane.getDividerSize())) {
resultsAreaSize = (int) evt.getNewValue();
}
}
}
});
} }
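The listener added above remembers the divider position only for manual drags: events fired while the animator is running, or positions that would push the divider onto the bottom edge, are ignored. A small hedged sketch of listening for divider moves on a plain JSplitPane; the 5-pixel slack mirrors the diff, everything else is illustrative:

import javax.swing.JFrame;
import javax.swing.JPanel;
import javax.swing.JSplitPane;
import javax.swing.SwingUtilities;

public class DividerListenerSketch {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JSplitPane split = new JSplitPane(JSplitPane.VERTICAL_SPLIT, new JPanel(), new JPanel());
            split.addPropertyChangeListener(JSplitPane.DIVIDER_LOCATION_PROPERTY, evt -> {
                if (evt.getNewValue() instanceof Integer) {
                    int newLocation = (Integer) evt.getNewValue();
                    // Ignore locations that collapse the bottom component entirely.
                    if (newLocation + 5 < split.getHeight() - split.getDividerSize()) {
                        System.out.println("remember divider location: " + newLocation);
                    }
                }
            });

            JFrame frame = new JFrame("Divider sketch");
            frame.add(split);
            frame.setSize(400, 400);
            frame.setDefaultCloseOperation(JFrame.DISPOSE_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}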
/** /**
@ -130,6 +143,7 @@ public final class DiscoveryTopComponent extends TopComponent {
@Override @Override
protected void componentClosed() { protected void componentClosed() {
DiscoveryDialog.getDiscoveryDialogInstance().cancelSearch(); DiscoveryDialog.getDiscoveryDialogInstance().cancelSearch();
DiscoveryEventUtils.getDiscoveryEventBus().post(new DiscoveryEventUtils.ClearInstanceSelectionEvent());
DiscoveryEventUtils.getDiscoveryEventBus().unregister(this); DiscoveryEventUtils.getDiscoveryEventBus().unregister(this);
DiscoveryEventUtils.getDiscoveryEventBus().unregister(groupListPanel); DiscoveryEventUtils.getDiscoveryEventBus().unregister(groupListPanel);
DiscoveryEventUtils.getDiscoveryEventBus().unregister(resultsPanel); DiscoveryEventUtils.getDiscoveryEventBus().unregister(resultsPanel);
@ -245,6 +259,7 @@ public final class DiscoveryTopComponent extends TopComponent {
void handleDetailsVisibleEvent(DiscoveryEventUtils.DetailsVisibleEvent detailsVisibleEvent) { void handleDetailsVisibleEvent(DiscoveryEventUtils.DetailsVisibleEvent detailsVisibleEvent) {
if (animator != null && animator.isRunning()) { if (animator != null && animator.isRunning()) {
animator.stop(); animator.stop();
animator = null;
} }
dividerLocation = rightSplitPane.getDividerLocation(); dividerLocation = rightSplitPane.getDividerLocation();
if (detailsVisibleEvent.isShowDetailsArea()) { if (detailsVisibleEvent.isShowDetailsArea()) {
@ -316,8 +331,9 @@ public final class DiscoveryTopComponent extends TopComponent {
@Override @Override
public boolean hasTerminated() { public boolean hasTerminated() {
if (dividerLocation != JSplitPane.UNDEFINED_CONDITION && dividerLocation < RESULTS_AREA_SMALL_SIZE) { if (dividerLocation != JSplitPane.UNDEFINED_CONDITION && dividerLocation < resultsAreaSize) {
dividerLocation = RESULTS_AREA_SMALL_SIZE; dividerLocation = resultsAreaSize;
animator = null;
return true; return true;
} }
return false; return false;
@ -340,6 +356,7 @@ public final class DiscoveryTopComponent extends TopComponent {
public boolean hasTerminated() { public boolean hasTerminated() {
if (dividerLocation > rightSplitPane.getHeight() || dividerLocation == JSplitPane.UNDEFINED_CONDITION) { if (dividerLocation > rightSplitPane.getHeight() || dividerLocation == JSplitPane.UNDEFINED_CONDITION) {
dividerLocation = rightSplitPane.getHeight(); dividerLocation = rightSplitPane.getHeight();
animator = null;
return true; return true;
} }
return false; return false;
@ -362,8 +379,9 @@ public final class DiscoveryTopComponent extends TopComponent {
@Override @Override
public void paintComponent(Graphics g) { public void paintComponent(Graphics g) {
if ((dividerLocation == JSplitPane.UNDEFINED_CONDITION) || (dividerLocation <= rightSplitPane.getHeight() && dividerLocation >= RESULTS_AREA_SMALL_SIZE)) { if (animator != null && animator.isRunning() && (dividerLocation == JSplitPane.UNDEFINED_CONDITION
rightSplitPane.setDividerLocation(dividerLocation); || (dividerLocation <= getHeight() && dividerLocation >= resultsAreaSize))) {
setDividerLocation(dividerLocation);
} }
super.paintComponent(g); super.paintComponent(g);
} }
@ -34,13 +34,15 @@ final class DocumentFilterPanel extends AbstractFiltersPanel {
DocumentFilterPanel() { DocumentFilterPanel() {
super(); super();
initComponents(); initComponents();
addFilter(new SizeFilterPanel(FileSearchData.FileType.DOCUMENTS), false, null, 0); SizeFilterPanel sizeFilterPanel = new SizeFilterPanel(FILE_TYPE);
int[] sizeIndicesSelected = {3, 4, 5};
addFilter(sizeFilterPanel, true, sizeIndicesSelected, 0);
addFilter(new DataSourceFilterPanel(), false, null, 0); addFilter(new DataSourceFilterPanel(), false, null, 0);
int[] pastOccurrencesIndices; int[] pastOccurrencesIndices;
if (!CentralRepository.isEnabled()) { if (!CentralRepository.isEnabled()) {
pastOccurrencesIndices = new int[]{0}; pastOccurrencesIndices = new int[]{0};
} else { } else {
pastOccurrencesIndices = new int[]{1, 2, 3, 4, 5, 6, 7}; pastOccurrencesIndices = new int[]{2, 3, 4};
} }
addFilter(new PastOccurrencesFilterPanel(), true, pastOccurrencesIndices, 0); addFilter(new PastOccurrencesFilterPanel(), true, pastOccurrencesIndices, 0);
addFilter(new HashSetFilterPanel(), false, null, 1); addFilter(new HashSetFilterPanel(), false, null, 1);
@ -63,7 +63,11 @@ final class GroupListPanel extends javax.swing.JPanel {
groupKeyList.setListData(new GroupKey[0]); groupKeyList.setListData(new GroupKey[0]);
} }
@Messages({"GroupsListPanel.noResults.message.text=No results were found for the selected filters.", @Messages({"GroupsListPanel.noResults.message.text=No results were found for the selected filters.\n\n"
+ "Reminder:\n"
+ " -The File Type Identification module must be run on each data source you want to find results in.\n"
+ " -The Hash Lookup module must be run on each data source if you want to filter by past occurrence.\n"
+ " -The Exif module must be run on each data source if you are filtering by User Created content.",
"GroupsListPanel.noResults.title.text=No results found"}) "GroupsListPanel.noResults.title.text=No results found"})
/** /**
* Subscribe to and update list of groups in response to * Subscribe to and update list of groups in response to
@ -86,7 +90,7 @@ final class GroupListPanel extends javax.swing.JPanel {
JOptionPane.showMessageDialog(DiscoveryTopComponent.getTopComponent(), JOptionPane.showMessageDialog(DiscoveryTopComponent.getTopComponent(),
Bundle.GroupsListPanel_noResults_message_text(), Bundle.GroupsListPanel_noResults_message_text(),
Bundle.GroupsListPanel_noResults_title_text(), Bundle.GroupsListPanel_noResults_title_text(),
JOptionPane.INFORMATION_MESSAGE); JOptionPane.PLAIN_MESSAGE);
} }
setCursor(Cursor.getPredefinedCursor(Cursor.DEFAULT_CURSOR)); setCursor(Cursor.getPredefinedCursor(Cursor.DEFAULT_CURSOR));
}); });
@ -35,14 +35,14 @@ final class ImageFilterPanel extends AbstractFiltersPanel {
super(); super();
initComponents(); initComponents();
SizeFilterPanel sizeFilterPanel = new SizeFilterPanel(FILE_TYPE); SizeFilterPanel sizeFilterPanel = new SizeFilterPanel(FILE_TYPE);
int[] sizeIndicesSelected = {1, 2, 3, 4, 5}; int[] sizeIndicesSelected = {3, 4, 5};
addFilter(sizeFilterPanel, true, sizeIndicesSelected, 0); addFilter(sizeFilterPanel, true, sizeIndicesSelected, 0);
addFilter(new DataSourceFilterPanel(), false, null, 0); addFilter(new DataSourceFilterPanel(), false, null, 0);
int[] pastOccurrencesIndices; int[] pastOccurrencesIndices;
if (!CentralRepository.isEnabled()) { if (!CentralRepository.isEnabled()) {
pastOccurrencesIndices = new int[]{0}; pastOccurrencesIndices = new int[]{0};
} else { } else {
pastOccurrencesIndices = new int[]{1, 2, 3, 4, 5, 6, 7}; pastOccurrencesIndices = new int[]{2, 3, 4};
} }
addFilter(new PastOccurrencesFilterPanel(), true, pastOccurrencesIndices, 0); addFilter(new PastOccurrencesFilterPanel(), true, pastOccurrencesIndices, 0);
addFilter(new UserCreatedFilterPanel(), false, null, 1); addFilter(new UserCreatedFilterPanel(), false, null, 1);
@ -3,7 +3,7 @@
<Form version="1.5" maxVersion="1.9" type="org.netbeans.modules.form.forminfo.JPanelFormInfo"> <Form version="1.5" maxVersion="1.9" type="org.netbeans.modules.form.forminfo.JPanelFormInfo">
<Properties> <Properties>
<Property name="minimumSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor"> <Property name="minimumSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor">
<Dimension value="[700, 200]"/> <Dimension value="[300, 60]"/>
</Property> </Property>
<Property name="preferredSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor"> <Property name="preferredSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor">
<Dimension value="[700, 700]"/> <Dimension value="[700, 700]"/>
@ -315,7 +315,7 @@
<Container class="javax.swing.JPanel" name="resultsViewerPanel"> <Container class="javax.swing.JPanel" name="resultsViewerPanel">
<Properties> <Properties>
<Property name="minimumSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor"> <Property name="minimumSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor">
<Dimension value="[0, 160]"/> <Dimension value="[0, 60]"/>
</Property> </Property>
<Property name="preferredSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor"> <Property name="preferredSize" type="java.awt.Dimension" editor="org.netbeans.beaninfo.editors.DimensionEditor">
<Dimension value="[700, 700]"/> <Dimension value="[700, 700]"/>
@ -376,7 +376,7 @@ final class ResultsPanel extends javax.swing.JPanel {
javax.swing.Box.Filler filler4 = new javax.swing.Box.Filler(new java.awt.Dimension(0, 0), new java.awt.Dimension(0, 0), new java.awt.Dimension(32767, 0)); javax.swing.Box.Filler filler4 = new javax.swing.Box.Filler(new java.awt.Dimension(0, 0), new java.awt.Dimension(0, 0), new java.awt.Dimension(32767, 0));
resultsViewerPanel = new javax.swing.JPanel(); resultsViewerPanel = new javax.swing.JPanel();
setMinimumSize(new java.awt.Dimension(700, 200)); setMinimumSize(new java.awt.Dimension(300, 60));
setPreferredSize(new java.awt.Dimension(700, 700)); setPreferredSize(new java.awt.Dimension(700, 700));
setLayout(new java.awt.BorderLayout()); setLayout(new java.awt.BorderLayout());
@ -533,7 +533,7 @@ final class ResultsPanel extends javax.swing.JPanel {
add(pagingPanel, java.awt.BorderLayout.PAGE_START); add(pagingPanel, java.awt.BorderLayout.PAGE_START);
resultsViewerPanel.setMinimumSize(new java.awt.Dimension(0, 160)); resultsViewerPanel.setMinimumSize(new java.awt.Dimension(0, 60));
resultsViewerPanel.setPreferredSize(new java.awt.Dimension(700, 700)); resultsViewerPanel.setPreferredSize(new java.awt.Dimension(700, 700));
resultsViewerPanel.setLayout(new java.awt.BorderLayout()); resultsViewerPanel.setLayout(new java.awt.BorderLayout());
add(resultsViewerPanel, java.awt.BorderLayout.CENTER); add(resultsViewerPanel, java.awt.BorderLayout.CENTER);
@ -39,7 +39,7 @@ final class SwingAnimator {
private Timer timer = null; private Timer timer = null;
//duration in milliseconds between each firing of the Timer //duration in milliseconds between each firing of the Timer
private static final int INITIAL_TIMING = 10; private static final int INITIAL_TIMING = 30;
private int timing = INITIAL_TIMING; private int timing = INITIAL_TIMING;
/** /**
@ -34,13 +34,15 @@ final class VideoFilterPanel extends AbstractFiltersPanel {
VideoFilterPanel() { VideoFilterPanel() {
super(); super();
initComponents(); initComponents();
addFilter(new SizeFilterPanel(FileSearchData.FileType.VIDEO), false, null, 0); SizeFilterPanel sizeFilterPanel = new SizeFilterPanel(FILE_TYPE);
int[] sizeIndicesSelected = {3, 4, 5};
addFilter(sizeFilterPanel, true, sizeIndicesSelected, 0);
addFilter(new DataSourceFilterPanel(), false, null, 0); addFilter(new DataSourceFilterPanel(), false, null, 0);
int[] pastOccurrencesIndices; int[] pastOccurrencesIndices;
if (!CentralRepository.isEnabled()) { if (!CentralRepository.isEnabled()) {
pastOccurrencesIndices = new int[]{0}; pastOccurrencesIndices = new int[]{0};
} else { } else {
pastOccurrencesIndices = new int[]{1, 2, 3, 4, 5, 6, 7}; pastOccurrencesIndices = new int[]{2, 3, 4};
} }
addFilter(new PastOccurrencesFilterPanel(), true, pastOccurrencesIndices, 0); addFilter(new PastOccurrencesFilterPanel(), true, pastOccurrencesIndices, 0);
addFilter(new UserCreatedFilterPanel(), false, null, 1); addFilter(new UserCreatedFilterPanel(), false, null, 1);
@ -501,9 +501,9 @@ class GeoFilterPanel extends javax.swing.JPanel {
DataSource dataSource, BlackboardArtifact.ARTIFACT_TYPE artifactType) throws TskCoreException { DataSource dataSource, BlackboardArtifact.ARTIFACT_TYPE artifactType) throws TskCoreException {
long count = 0; long count = 0;
String queryStr String queryStr
= "SELECT count(DISTINCT artifact_id) AS count FROM" = "SELECT count(DISTINCT artIds) AS count FROM"
+ " (" + " ("
+ " SELECT * FROM blackboard_artifacts as arts" + " SELECT arts.artifact_id as artIds, * FROM blackboard_artifacts as arts"
+ " INNER JOIN blackboard_attributes as attrs" + " INNER JOIN blackboard_attributes as attrs"
+ " ON attrs.artifact_id = arts.artifact_id" + " ON attrs.artifact_id = arts.artifact_id"
+ " WHERE arts.artifact_type_id = " + artifactType.getTypeID() + " WHERE arts.artifact_type_id = " + artifactType.getTypeID()
@ -516,7 +516,7 @@ class GeoFilterPanel extends javax.swing.JPanel {
+ " or attrs.attribute_type_id = " + BlackboardAttribute.ATTRIBUTE_TYPE.TSK_GEO_TRACKPOINTS.getTypeID() + " or attrs.attribute_type_id = " + BlackboardAttribute.ATTRIBUTE_TYPE.TSK_GEO_TRACKPOINTS.getTypeID()
+ " or attrs.attribute_type_id = " + BlackboardAttribute.ATTRIBUTE_TYPE.TSK_GEO_WAYPOINTS.getTypeID() + " or attrs.attribute_type_id = " + BlackboardAttribute.ATTRIBUTE_TYPE.TSK_GEO_WAYPOINTS.getTypeID()
+ " )" + " )"
+ " )"; + " ) as innerTable";
try (SleuthkitCase.CaseDbQuery queryResult = sleuthkitCase.executeQuery(queryStr); try (SleuthkitCase.CaseDbQuery queryResult = sleuthkitCase.executeQuery(queryStr);
ResultSet resultSet = queryResult.getResultSet()) { ResultSet resultSet = queryResult.getResultSet()) {
if (resultSet.next()) { if (resultSet.next()) {
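The rewritten query counts DISTINCT over an explicitly aliased artifact-id column and names the derived table (innerTable); PostgreSQL rejects a subquery in FROM that has no alias, and the explicit artIds alias avoids the ambiguity of artifact_id existing in both joined tables. A hedged, simplified illustration of the query shape only; the type ids are made up and the snippet uses plain string building, not the CaseDbQuery API:

public class GeoCountQuerySketch {
    public static void main(String[] args) {
        int artifactTypeId = 32;   // illustrative type id, not a real Autopsy constant
        int attributeTypeId = 101; // illustrative attribute id
        String queryStr =
                "SELECT count(DISTINCT artIds) AS count FROM"
                + " ("
                + "   SELECT arts.artifact_id AS artIds FROM blackboard_artifacts AS arts"
                + "   INNER JOIN blackboard_attributes AS attrs"
                + "     ON attrs.artifact_id = arts.artifact_id"
                + "   WHERE arts.artifact_type_id = " + artifactTypeId
                + "     AND attrs.attribute_type_id = " + attributeTypeId
                + " ) AS innerTable"; // PostgreSQL requires an alias on a derived table
        System.out.println(queryStr);
    }
}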
@ -23,8 +23,6 @@ HashDbIngestModule.lookingUpKnownBadHashValueErr=Error encountered while looking
HashDbIngestModule.lookingUpKnownHashValueErr=Error encountered while looking up known hash value for {0}. HashDbIngestModule.lookingUpKnownHashValueErr=Error encountered while looking up known hash value for {0}.
# {0} - fileName # {0} - fileName
HashDbIngestModule.lookingUpNoChangeHashValueErr=Error encountered while looking up no change hash value for {0}. HashDbIngestModule.lookingUpNoChangeHashValueErr=Error encountered while looking up no change hash value for {0}.
HashDbIngestModule.noChangeFileSearchWillNotExecuteWarn='No Change' file search will not be executed.
HashDbIngestModule.noChangeHashDbSetMsg=No 'No Change' hash set.
HashDbIngestModule.noKnownBadHashDbSetMsg=No notable hash set. HashDbIngestModule.noKnownBadHashDbSetMsg=No notable hash set.
HashDbIngestModule.noKnownHashDbSetMsg=No known hash set. HashDbIngestModule.noKnownHashDbSetMsg=No known hash set.
HashDbManager.CentralRepoHashDb.orgError=Error loading organization HashDbManager.CentralRepoHashDb.orgError=Error loading organization
@ -34,6 +32,8 @@ HashDbManager.knownBad.text=Notable
HashDbManager.noChange.text=No Change HashDbManager.noChange.text=No Change
# {0} - hash set name # {0} - hash set name
HashDbManager.noDbPath.message=Couldn't get valid hash set path for: {0} HashDbManager.noDbPath.message=Couldn't get valid hash set path for: {0}
# {0} - hashSetName
HashDbManager_handleNameConflict_conflictSuffix={0} (Custom)
HashDbSearchAction.noOpenCase.errMsg=No open case available. HashDbSearchAction.noOpenCase.errMsg=No open case available.
HashDbSearchPanel.noOpenCase.errMsg=No open case available. HashDbSearchPanel.noOpenCase.errMsg=No open case available.
HashLookupSettingsPanel.centralRepo=Central Repository HashLookupSettingsPanel.centralRepo=Central Repository
@ -22,21 +22,28 @@ import java.beans.PropertyChangeEvent;
import java.beans.PropertyChangeListener; import java.beans.PropertyChangeListener;
import java.beans.PropertyChangeSupport; import java.beans.PropertyChangeSupport;
import java.io.File; import java.io.File;
import java.io.FilenameFilter;
import java.io.IOException; import java.io.IOException;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.HashSet; import java.util.HashSet;
import java.util.List; import java.util.List;
import java.util.MissingResourceException;
import java.util.Objects; import java.util.Objects;
import java.util.Set; import java.util.Set;
import java.util.concurrent.ExecutionException; import java.util.concurrent.ExecutionException;
import java.util.logging.Level; import java.util.logging.Level;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import java.util.stream.Stream; import java.util.stream.Stream;
import javax.swing.JFileChooser; import javax.swing.JFileChooser;
import javax.swing.JOptionPane; import javax.swing.JOptionPane;
import javax.swing.SwingWorker; import javax.swing.SwingWorker;
import javax.swing.filechooser.FileNameExtensionFilter; import javax.swing.filechooser.FileNameExtensionFilter;
import org.apache.commons.io.FilenameUtils; import org.apache.commons.io.FilenameUtils;
import org.apache.commons.lang.StringUtils;
import org.netbeans.api.progress.ProgressHandle; import org.netbeans.api.progress.ProgressHandle;
import org.openide.modules.InstalledFileLocator;
import org.openide.util.NbBundle; import org.openide.util.NbBundle;
import org.openide.util.NbBundle.Messages; import org.openide.util.NbBundle.Messages;
import org.openide.windows.WindowManager; import org.openide.windows.WindowManager;
@ -71,10 +78,29 @@ public class HashDbManager implements PropertyChangeListener {
private List<HashDb> hashSets = new ArrayList<>(); private List<HashDb> hashSets = new ArrayList<>();
private Set<String> hashSetNames = new HashSet<>(); private Set<String> hashSetNames = new HashSet<>();
private Set<String> hashSetPaths = new HashSet<>(); private Set<String> hashSetPaths = new HashSet<>();
private List<HashDb> officialHashSets = new ArrayList<>();
private Set<String> officialHashSetNames = new HashSet<>();
private Set<String> officialHashSetPaths = new HashSet<>();
PropertyChangeSupport changeSupport = new PropertyChangeSupport(HashDbManager.class); PropertyChangeSupport changeSupport = new PropertyChangeSupport(HashDbManager.class);
private static final Logger logger = Logger.getLogger(HashDbManager.class.getName()); private static final Logger logger = Logger.getLogger(HashDbManager.class.getName());
private boolean allDatabasesLoadedCorrectly = false; private boolean allDatabasesLoadedCorrectly = false;
private static final String OFFICIAL_HASH_SETS_FOLDER = "OfficialHashSets";
private static final String KDB_EXT = "kdb";
private static final String DB_NAME_PARAM = "dbName";
private static final String KNOWN_STATUS_PARAM = "knownStatus";
private static final Pattern OFFICIAL_FILENAME = Pattern.compile("(?<" + DB_NAME_PARAM + ">.+?)\\.(?<" + KNOWN_STATUS_PARAM + ">.+?)\\." + KDB_EXT);
private static final FilenameFilter DEFAULT_KDB_FILTER = new FilenameFilter() {
@Override
public boolean accept(File dir, String name) {
return name.endsWith("." + KDB_EXT);
}
};
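OFFICIAL_FILENAME encodes the naming convention for bundled hash sets, <database name>.<known status>.kdb, with named groups so the loader can recover both parts. A hedged sketch of how that pattern parses a hypothetical file name (the name below is invented for illustration):

import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class OfficialNameParseSketch {
    public static void main(String[] args) {
        // Same shape as OFFICIAL_FILENAME: reluctant name, reluctant status, ".kdb" extension.
        Pattern officialFilename = Pattern.compile("(?<dbName>.+?)\\.(?<knownStatus>.+?)\\.kdb");
        Matcher match = officialFilename.matcher("ExampleOfficialSet.Known.kdb"); // hypothetical file name
        if (match.find()) {
            System.out.println("dbName      = " + match.group("dbName"));      // ExampleOfficialSet
            System.out.println("knownStatus = " + match.group("knownStatus")); // Known
        }
    }
}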
/** /**
* Property change event support In events: For both of these enums, the old * Property change event support In events: For both of these enums, the old
* value should be null, and the new value should be the hashset name * value should be null, and the new value should be the hashset name
@ -168,13 +194,7 @@ public class HashDbManager implements PropertyChangeListener {
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.hashDbDoesNotExistExceptionMsg", path)); throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.hashDbDoesNotExistExceptionMsg", path));
} }
if (hashSetPaths.contains(path)) { checkDbCollision(path, hashSetName);
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.hashDbAlreadyAddedExceptionMsg", path));
}
if (hashSetNames.contains(hashSetName)) {
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.duplicateHashSetNameExceptionMsg", hashSetName));
}
hashDb = addHashDatabase(SleuthkitJNI.openHashDatabase(path), hashSetName, searchDuringIngest, sendIngestMessages, knownFilesType); hashDb = addHashDatabase(SleuthkitJNI.openHashDatabase(path), hashSetName, searchDuringIngest, sendIngestMessages, knownFilesType);
} catch (TskCoreException ex) { } catch (TskCoreException ex) {
@ -225,13 +245,7 @@ public class HashDbManager implements PropertyChangeListener {
getHashDatabaseFileExtension())); getHashDatabaseFileExtension()));
} }
if (hashSetPaths.contains(path)) { checkDbCollision(path, hashSetName);
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.hashDbAlreadyAddedExceptionMsg", path));
}
if (hashSetNames.contains(hashSetName)) {
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.duplicateHashSetNameExceptionMsg", hashSetName));
}
hashDb = addHashDatabase(SleuthkitJNI.createHashDatabase(path), hashSetName, searchDuringIngest, sendIngestMessages, knownFilesType); hashDb = addHashDatabase(SleuthkitJNI.createHashDatabase(path), hashSetName, searchDuringIngest, sendIngestMessages, knownFilesType);
} catch (TskCoreException ex) { } catch (TskCoreException ex) {
@ -240,6 +254,27 @@ public class HashDbManager implements PropertyChangeListener {
return hashDb; return hashDb;
} }
/**
* Throws an exception if the provided path or hashSetName already belongs to
* an existing database.
*
* @param path The path.
* @param hashSetName The hash set name.
*
* @throws
* org.sleuthkit.autopsy.modules.hashdatabase.HashDbManager.HashDbManagerException
* @throws MissingResourceException
*/
private void checkDbCollision(String path, String hashSetName) throws HashDbManagerException, MissingResourceException {
if (hashSetPaths.contains(path) || officialHashSetPaths.contains(path)) {
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.hashDbAlreadyAddedExceptionMsg", path));
}
if (hashSetNames.contains(hashSetName) || officialHashSetNames.contains(hashSetName)) {
throw new HashDbManagerException(NbBundle.getMessage(HashDbManager.class, "HashDbManager.duplicateHashSetNameExceptionMsg", hashSetName));
}
}
private SleuthkitHashSet addHashDatabase(int handle, String hashSetName, boolean searchDuringIngest, boolean sendIngestMessages, HashDb.KnownFilesType knownFilesType) throws TskCoreException { private SleuthkitHashSet addHashDatabase(int handle, String hashSetName, boolean searchDuringIngest, boolean sendIngestMessages, HashDb.KnownFilesType knownFilesType) throws TskCoreException {
// Wrap an object around the handle. // Wrap an object around the handle.
SleuthkitHashSet hashDb = new SleuthkitHashSet(handle, hashSetName, searchDuringIngest, sendIngestMessages, knownFilesType); SleuthkitHashSet hashDb = new SleuthkitHashSet(handle, hashSetName, searchDuringIngest, sendIngestMessages, knownFilesType);
@ -420,9 +455,8 @@ public class HashDbManager implements PropertyChangeListener {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error loading central repository hash sets", ex); //NON-NLS Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error loading central repository hash sets", ex); //NON-NLS
} }
List<HashDb> hashDbs = new ArrayList<>(); return Stream.concat(this.officialHashSets.stream(), this.hashSets.stream())
hashDbs.addAll(this.hashSets); .collect(Collectors.toList());
return hashDbs;
} }
/** /**
@ -431,16 +465,10 @@ public class HashDbManager implements PropertyChangeListener {
* @return A list, possibly empty, of hash databases. * @return A list, possibly empty, of hash databases.
*/ */
public synchronized List<HashDb> getKnownFileHashSets() { public synchronized List<HashDb> getKnownFileHashSets() {
List<HashDb> hashDbs = new ArrayList<>(); return getAllHashSets()
try { .stream()
updateHashSetsFromCentralRepository(); .filter((db) -> (db.getKnownFilesType() == HashDb.KnownFilesType.KNOWN))
} catch (TskCoreException ex) { .collect(Collectors.toList());
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error loading central repository hash sets", ex); //NON-NLS
}
this.hashSets.stream().filter((db) -> (db.getKnownFilesType() == HashDb.KnownFilesType.KNOWN)).forEach((db) -> {
hashDbs.add(db);
});
return hashDbs;
} }
/** /**
@ -449,16 +477,10 @@ public class HashDbManager implements PropertyChangeListener {
* @return A list, possibly empty, of hash databases. * @return A list, possibly empty, of hash databases.
*/ */
public synchronized List<HashDb> getKnownBadFileHashSets() { public synchronized List<HashDb> getKnownBadFileHashSets() {
List<HashDb> hashDbs = new ArrayList<>(); return getAllHashSets()
try { .stream()
updateHashSetsFromCentralRepository(); .filter((db) -> (db.getKnownFilesType() == HashDb.KnownFilesType.KNOWN_BAD))
} catch (TskCoreException ex) { .collect(Collectors.toList());
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error loading central repository hash sets", ex); //NON-NLS
}
this.hashSets.stream().filter((db) -> (db.getKnownFilesType() == HashDb.KnownFilesType.KNOWN_BAD)).forEach((db) -> {
hashDbs.add(db);
});
return hashDbs;
} }
/** /**
@ -467,26 +489,21 @@ public class HashDbManager implements PropertyChangeListener {
* @return A list, possibly empty, of hash databases. * @return A list, possibly empty, of hash databases.
*/ */
public synchronized List<HashDb> getUpdateableHashSets() { public synchronized List<HashDb> getUpdateableHashSets() {
return getUpdateableHashSets(this.hashSets); return getUpdateableHashSets(getAllHashSets());
} }
private List<HashDb> getUpdateableHashSets(List<HashDb> hashDbs) { private List<HashDb> getUpdateableHashSets(List<HashDb> hashDbs) {
ArrayList<HashDb> updateableDbs = new ArrayList<>(); return hashDbs
try { .stream()
updateHashSetsFromCentralRepository(); .filter((HashDb db) -> {
} catch (TskCoreException ex) { try {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error loading central repository hash sets", ex); //NON-NLS return db.isUpdateable();
} } catch (TskCoreException ex) {
for (HashDb db : hashDbs) { Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error checking updateable status of " + db.getHashSetName() + " hash set", ex); //NON-NLS
try { return false;
if (db.isUpdateable()) { }
updateableDbs.add(db); })
} .collect(Collectors.toList());
} catch (TskCoreException ex) {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error checking updateable status of " + db.getHashSetName() + " hash set", ex); //NON-NLS
}
}
return updateableDbs;
} }
private List<HashDbInfo> getCentralRepoHashSetsFromDatabase() { private List<HashDbInfo> getCentralRepoHashSetsFromDatabase() {
@ -536,66 +553,197 @@ public class HashDbManager implements PropertyChangeListener {
} }
private void loadHashsetsConfiguration() { private void loadHashsetsConfiguration() {
loadOfficialHashSets();
try { try {
HashLookupSettings settings = HashLookupSettings.readSettings(); HashLookupSettings settings = HashLookupSettings.readSettings();
this.configureSettings(settings); this.configureSettings(settings, officialHashSetNames);
} catch (HashLookupSettings.HashLookupSettingsException ex) { } catch (HashLookupSettings.HashLookupSettingsException ex) {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Could not read Hash lookup settings from disk.", ex); Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Could not read Hash lookup settings from disk.", ex);
} }
} }
/**
* Loads official hash sets into officialHashSets and also populates
* officialHashSetPaths and officialHashSetNames variables.
*/
private void loadOfficialHashSets() {
officialHashSetPaths = new HashSet<>();
officialHashSetNames = new HashSet<>();
try {
officialHashSets = loadOfficialHashSetsFromFolder(OFFICIAL_HASH_SETS_FOLDER);
officialHashSets.forEach(db -> {
officialHashSetNames.add(db.getHashSetName());
try {
String databasePath = db.getDatabasePath();
String indexPath = db.getIndexPath();
if (StringUtils.isNotBlank(databasePath) && !databasePath.equals("None")) { //NON-NLS
officialHashSetPaths.add(databasePath);
}
if (StringUtils.isNotBlank(indexPath) && !indexPath.equals("None")) { //NON-NLS
officialHashSetPaths.add(indexPath);
}
} catch (TskCoreException ex) {
logger.log(Level.SEVERE, "There was an error loading the official hash set name.", ex);
}
});
} catch (HashDbManagerException ex) {
logger.log(Level.WARNING, "There was an error loading the official hash sets.", ex);
officialHashSets = new ArrayList<HashDb>();
}
}
/**
* Handles a potential conflict between official and non-official hash sets.
* Non-official hashsets have '(Custom)' added. If a conflict is identified,
* the hashset settings are fixed, saved, reloaded, and returned. Otherwise,
* the original list is returned.
*
* @param curHashsets The list of non-official hash sets.
* @param officialNames The set of names for official hash sets.
*
* @return The new list of non-official hash sets with conflicts removed.
*/
@Messages({
"# {0} - hashSetName",
"HashDbManager_handleNameConflict_conflictSuffix={0} (Custom)"
})
private List<HashDbInfo> handleNameConflict(List<HashDbInfo> curHashsets, Set<String> officialNames) {
Set<String> curNames = new HashSet<String>(officialNames);
boolean change = false;
List<HashDbInfo> newItems = new ArrayList<>();
for (HashDbInfo hashset : curHashsets) {
String thisName = hashset.getHashSetName();
if (curNames.contains(thisName)) {
while (curNames.contains(thisName)) {
thisName = Bundle.HashDbManager_handleNameConflict_conflictSuffix(thisName);
}
newItems.add(new HashDbInfo(
thisName,
hashset.getKnownFilesType(),
hashset.getSearchDuringIngest(),
hashset.getSendIngestMessages(),
hashset.getPath(),
hashset.getReferenceSetID(),
hashset.getVersion(),
hashset.isReadOnly(),
hashset.isCentralRepoDatabaseType()
));
change = true;
} else {
newItems.add(hashset);
}
curNames.add(thisName);
}
if (!change) {
return curHashsets;
} else {
try {
HashLookupSettings.writeSettings(new HashLookupSettings(newItems));
HashLookupSettings toRet = HashLookupSettings.readSettings();
return toRet.getHashDbInfo();
} catch (HashLookupSettings.HashLookupSettingsException ex) {
logger.log(Level.SEVERE, "There was an error while trying to resave after name conflict.", ex);
return newItems;
}
}
}
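The renaming loop keeps appending the "(Custom)" suffix until a user-defined set name no longer collides with an official name or an earlier renamed entry. A tiny hedged sketch of just that loop over plain strings; the set contents are illustrative:

import java.util.HashSet;
import java.util.Set;

public class NameConflictSketch {
    public static void main(String[] args) {
        Set<String> takenNames = new HashSet<>();
        takenNames.add("Sample Official Set");           // pretend official hash set name
        takenNames.add("Sample Official Set (Custom)");  // pretend earlier rename

        String candidate = "Sample Official Set";
        while (takenNames.contains(candidate)) {
            candidate = candidate + " (Custom)";         // same suffixing idea as the Bundle message
        }
        takenNames.add(candidate);

        System.out.println(candidate); // Sample Official Set (Custom) (Custom)
    }
}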
/**
* Loads official hash sets from the given folder.
*
* @param folder The folder from which to load official hash sets.
*
* @return The List of found hash sets.
*
* @throws HashDbManagerException If folder does not exist.
*/
private List<HashDb> loadOfficialHashSetsFromFolder(String folder) throws HashDbManagerException {
File configFolder = InstalledFileLocator.getDefault().locate(
folder, HashDbManager.class.getPackage().getName(), false);
if (configFolder == null || !configFolder.exists() || !configFolder.isDirectory()) {
throw new HashDbManagerException("Folder provided: " + folder + " does not exist.");
}
return Stream.of(configFolder.listFiles(DEFAULT_KDB_FILTER))
.map((f) -> {
try {
return getOfficialHashDbFromFile(f);
} catch (HashDbManagerException | TskCoreException ex) {
logger.log(Level.WARNING, String.format("Hashset: %s could not be properly read.", f.getAbsolutePath()), ex);
return null;
}
})
.filter((hashdb) -> hashdb != null)
.collect(Collectors.toList());
}
/**
* Loads an official hash set from the given file.
*
* @param file The kdb file to load.
*
* @return The HashDbInfo of the official set.
*
* @throws HashDbManagerException If file does not exist or does not match
* naming convention (See
* HashDbManager.OFFICIAL_FILENAME for
* regex).
*/
private HashDb getOfficialHashDbFromFile(File file) throws HashDbManagerException, TskCoreException {
if (file == null || !file.exists()) {
throw new HashDbManagerException(String.format("No file found for: %s", file == null ? "<null>" : file.getAbsolutePath()));
}
String filename = file.getName();
Matcher match = OFFICIAL_FILENAME.matcher(filename);
if (!match.find()) {
throw new HashDbManagerException(String.format("File with name: %s does not match regex of: %s", filename, OFFICIAL_FILENAME.toString()));
}
String hashdbName = match.group(DB_NAME_PARAM);
final String knownStatus = match.group(KNOWN_STATUS_PARAM);
KnownFilesType knownFilesType = Stream.of(HashDb.KnownFilesType.values())
.filter(k -> k.getIdentifier().toUpperCase().equals(knownStatus.toUpperCase()))
.findFirst()
.orElseThrow(() -> new HashDbManagerException(String.format("No KnownFilesType matches %s for file: %s", knownStatus, filename)));
return new SleuthkitHashSet(
SleuthkitJNI.openHashDatabase(file.getAbsolutePath()),
hashdbName,
true, //searchDuringIngest
false, //sendIngestMessages
knownFilesType,
true); // official set
}
/** /**
* Configures the given settings object by adding all contained hash db to * Configures the given settings object by adding all contained hash db to
* the system. * the system.
* *
* @param settings The settings to configure. * @param settings The settings to configure.
* @param officialSetNames The official set names. Any name collisions will
* trigger rename for primary file.
*/ */
@Messages({"# {0} - hash set name", "HashDbManager.noDbPath.message=Couldn't get valid hash set path for: {0}", @Messages({"# {0} - hash set name", "HashDbManager.noDbPath.message=Couldn't get valid hash set path for: {0}",
"HashDbManager.centralRepoLoadError.message=Error loading central repository hash sets"}) "HashDbManager.centralRepoLoadError.message=Error loading central repository hash sets"})
private void configureSettings(HashLookupSettings settings) { private void configureSettings(HashLookupSettings settings, Set<String> officialSetNames) {
allDatabasesLoadedCorrectly = true; allDatabasesLoadedCorrectly = true;
List<HashDbInfo> hashDbInfoList = settings.getHashDbInfo(); List<HashDbInfo> hashDbInfoList = settings.getHashDbInfo();
hashDbInfoList = handleNameConflict(hashDbInfoList, officialSetNames);
for (HashDbInfo hashDbInfo : hashDbInfoList) { for (HashDbInfo hashDbInfo : hashDbInfoList) {
try { configureLocalDb(hashDbInfo);
if (hashDbInfo.isFileDatabaseType()) {
String dbPath = this.getValidFilePath(hashDbInfo.getHashSetName(), hashDbInfo.getPath());
if (dbPath != null) {
addHashDatabase(SleuthkitJNI.openHashDatabase(dbPath), hashDbInfo.getHashSetName(), hashDbInfo.getSearchDuringIngest(), hashDbInfo.getSendIngestMessages(), hashDbInfo.getKnownFilesType());
} else {
logger.log(Level.WARNING, Bundle.HashDbManager_noDbPath_message(hashDbInfo.getHashSetName()));
allDatabasesLoadedCorrectly = false;
}
} else {
if (CentralRepository.isEnabled()) {
addExistingCentralRepoHashSet(hashDbInfo.getHashSetName(), hashDbInfo.getVersion(),
hashDbInfo.getReferenceSetID(),
hashDbInfo.getSearchDuringIngest(), hashDbInfo.getSendIngestMessages(),
hashDbInfo.getKnownFilesType(), hashDbInfo.isReadOnly());
}
}
} catch (TskCoreException ex) {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error opening hash set", ex); //NON-NLS
JOptionPane.showMessageDialog(WindowManager.getDefault().getMainWindow(),
NbBundle.getMessage(this.getClass(),
"HashDbManager.unableToOpenHashDbMsg", hashDbInfo.getHashSetName()),
NbBundle.getMessage(this.getClass(), "HashDbManager.openHashDbErr"),
JOptionPane.ERROR_MESSAGE);
allDatabasesLoadedCorrectly = false;
}
} }
if (CentralRepository.isEnabled()) { if (CentralRepository.isEnabled()) {
try { configureCrDbs();
updateHashSetsFromCentralRepository();
} catch (TskCoreException ex) {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error opening hash set", ex); //NON-NLS
JOptionPane.showMessageDialog(WindowManager.getDefault().getMainWindow(),
Bundle.HashDbManager_centralRepoLoadError_message(),
NbBundle.getMessage(this.getClass(), "HashDbManager.openHashDbErr"),
JOptionPane.ERROR_MESSAGE);
allDatabasesLoadedCorrectly = false;
}
} }
/* /*
@ -619,6 +767,56 @@ public class HashDbManager implements PropertyChangeListener {
} }
} }
/**
* Configures central repository hash set databases.
*/
private void configureCrDbs() {
try {
updateHashSetsFromCentralRepository();
} catch (TskCoreException ex) {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error opening hash set", ex); //NON-NLS
JOptionPane.showMessageDialog(WindowManager.getDefault().getMainWindow(),
Bundle.HashDbManager_centralRepoLoadError_message(),
NbBundle.getMessage(this.getClass(), "HashDbManager.openHashDbErr"),
JOptionPane.ERROR_MESSAGE);
allDatabasesLoadedCorrectly = false;
}
}
/**
* Handles configuring a local hash set database.
* @param hashDbInfo The saved settings for the local hash set database.
*/
private void configureLocalDb(HashDbInfo hashDbInfo) {
try {
if (hashDbInfo.isFileDatabaseType()) {
String dbPath = this.getValidFilePath(hashDbInfo.getHashSetName(), hashDbInfo.getPath());
if (dbPath != null) {
addHashDatabase(SleuthkitJNI.openHashDatabase(dbPath), hashDbInfo.getHashSetName(), hashDbInfo.getSearchDuringIngest(), hashDbInfo.getSendIngestMessages(), hashDbInfo.getKnownFilesType());
} else {
logger.log(Level.WARNING, Bundle.HashDbManager_noDbPath_message(hashDbInfo.getHashSetName()));
allDatabasesLoadedCorrectly = false;
}
} else {
if (CentralRepository.isEnabled()) {
addExistingCentralRepoHashSet(hashDbInfo.getHashSetName(), hashDbInfo.getVersion(),
hashDbInfo.getReferenceSetID(),
hashDbInfo.getSearchDuringIngest(), hashDbInfo.getSendIngestMessages(),
hashDbInfo.getKnownFilesType(), hashDbInfo.isReadOnly());
}
}
} catch (TskCoreException ex) {
Logger.getLogger(HashDbManager.class.getName()).log(Level.SEVERE, "Error opening hash set", ex); //NON-NLS
JOptionPane.showMessageDialog(WindowManager.getDefault().getMainWindow(),
NbBundle.getMessage(this.getClass(),
"HashDbManager.unableToOpenHashDbMsg", hashDbInfo.getHashSetName()),
NbBundle.getMessage(this.getClass(), "HashDbManager.openHashDbErr"),
JOptionPane.ERROR_MESSAGE);
allDatabasesLoadedCorrectly = false;
}
}
private void updateHashSetsFromCentralRepository() throws TskCoreException { private void updateHashSetsFromCentralRepository() throws TskCoreException {
if (CentralRepository.isEnabled()) { if (CentralRepository.isEnabled()) {
List<HashDbInfo> crHashDbInfoList = getCentralRepoHashSetsFromDatabase(); List<HashDbInfo> crHashDbInfoList = getCentralRepoHashSetsFromDatabase();
@ -702,17 +900,21 @@ public class HashDbManager implements PropertyChangeListener {
}) })
public enum KnownFilesType { public enum KnownFilesType {
KNOWN(Bundle.HashDbManager_known_text(), TskData.FileKnown.KNOWN, false, false), KNOWN(Bundle.HashDbManager_known_text(), "Known", TskData.FileKnown.KNOWN, false, false),
KNOWN_BAD(Bundle.HashDbManager_knownBad_text(), TskData.FileKnown.BAD, true, true), KNOWN_BAD(Bundle.HashDbManager_knownBad_text(), "Notable", TskData.FileKnown.BAD, true, true),
NO_CHANGE(Bundle.HashDbManager_noChange_text(), TskData.FileKnown.UNKNOWN, true, false); NO_CHANGE(Bundle.HashDbManager_noChange_text(), "NoChange", TskData.FileKnown.UNKNOWN, true, false);
private final String displayName; private final String displayName;
private final String identifier;
private final TskData.FileKnown fileKnown; private final TskData.FileKnown fileKnown;
private final boolean allowSendInboxMessages; private final boolean allowSendInboxMessages;
private final boolean defaultSendInboxMessages; private final boolean defaultSendInboxMessages;
KnownFilesType(String displayName, TskData.FileKnown fileKnown, boolean allowSendInboxMessages, boolean defaultSendInboxMessages) { KnownFilesType(String displayName, String identifier, TskData.FileKnown fileKnown,
boolean allowSendInboxMessages, boolean defaultSendInboxMessages) {
this.displayName = displayName; this.displayName = displayName;
this.identifier = identifier;
this.fileKnown = fileKnown; this.fileKnown = fileKnown;
this.allowSendInboxMessages = allowSendInboxMessages; this.allowSendInboxMessages = allowSendInboxMessages;
this.defaultSendInboxMessages = defaultSendInboxMessages; this.defaultSendInboxMessages = defaultSendInboxMessages;
@ -740,6 +942,16 @@ public class HashDbManager implements PropertyChangeListener {
return defaultSendInboxMessages; return defaultSendInboxMessages;
} }
/**
* Returns the identifier for this KnownFilesType. This identifier is used
* in the official hash set file naming convention.
*
* @return The identifier for this type.
*/
String getIdentifier() {
return identifier;
}
public String getDisplayName() { public String getDisplayName() {
return this.displayName; return this.displayName;
} }
@ -863,14 +1075,20 @@ public class HashDbManager implements PropertyChangeListener {
private final HashDb.KnownFilesType knownFilesType; private final HashDb.KnownFilesType knownFilesType;
private boolean indexing; private boolean indexing;
private final PropertyChangeSupport propertyChangeSupport = new PropertyChangeSupport(this); private final PropertyChangeSupport propertyChangeSupport = new PropertyChangeSupport(this);
private final boolean officialSet;
private SleuthkitHashSet(int handle, String hashSetName, boolean useForIngest, boolean sendHitMessages, KnownFilesType knownFilesType) { private SleuthkitHashSet(int handle, String hashSetName, boolean useForIngest, boolean sendHitMessages, KnownFilesType knownFilesType) {
this(handle, hashSetName, useForIngest, sendHitMessages, knownFilesType, false);
}
private SleuthkitHashSet(int handle, String hashSetName, boolean useForIngest, boolean sendHitMessages, KnownFilesType knownFilesType, boolean officialSet) {
this.handle = handle; this.handle = handle;
this.hashSetName = hashSetName; this.hashSetName = hashSetName;
this.searchDuringIngest = useForIngest; this.searchDuringIngest = useForIngest;
this.sendIngestMessages = sendHitMessages; this.sendIngestMessages = sendHitMessages;
this.knownFilesType = knownFilesType; this.knownFilesType = knownFilesType;
this.indexing = false; this.indexing = false;
this.officialSet = officialSet;
} }
/** /**
@ -956,6 +1174,10 @@ public class HashDbManager implements PropertyChangeListener {
*/ */
@Override @Override
public boolean isUpdateable() throws TskCoreException { public boolean isUpdateable() throws TskCoreException {
if (isOfficialSet()) {
return false;
}
return SleuthkitJNI.isUpdateableHashDatabase(this.handle); return SleuthkitJNI.isUpdateableHashDatabase(this.handle);
} }
@ -986,6 +1208,7 @@ public class HashDbManager implements PropertyChangeListener {
public void addHashes(Content content, String comment) throws TskCoreException { public void addHashes(Content content, String comment) throws TskCoreException {
// This only works for AbstractFiles and MD5 hashes at present. // This only works for AbstractFiles and MD5 hashes at present.
assert content instanceof AbstractFile; assert content instanceof AbstractFile;
officialSetCheck();
if (content instanceof AbstractFile) { if (content instanceof AbstractFile) {
AbstractFile file = (AbstractFile) content; AbstractFile file = (AbstractFile) content;
if (null != file.getMd5Hash()) { if (null != file.getMd5Hash()) {
@ -994,6 +1217,17 @@ public class HashDbManager implements PropertyChangeListener {
} }
} }
/**
* Throws an exception if the current set is an official set.
*
* @throws TskCoreException If this hash set is an official set.
*/
private void officialSetCheck() throws TskCoreException {
if (isOfficialSet()) {
throw new TskCoreException("Hashes cannot be added to an official set");
}
}
/** /**
* Adds a list of hashes to the hash database at once * Adds a list of hashes to the hash database at once
* *
@ -1003,6 +1237,7 @@ public class HashDbManager implements PropertyChangeListener {
*/ */
@Override @Override
public void addHashes(List<HashEntry> hashes) throws TskCoreException { public void addHashes(List<HashEntry> hashes) throws TskCoreException {
officialSetCheck();
SleuthkitJNI.addToHashDatabase(hashes, handle); SleuthkitJNI.addToHashDatabase(hashes, handle);
} }
@ -1122,6 +1357,16 @@ public class HashDbManager implements PropertyChangeListener {
} }
return true; return true;
} }
/**
* Returns whether or not the set is an official set. If the set is an
* official set, it is treated as read-only and cannot be deleted.
*
* @return Whether or not the set is an official set.
*/
boolean isOfficialSet() {
return officialSet;
}
} }
/** /**

View File

@ -373,6 +373,35 @@ final class HashLookupSettings implements Serializable {
private final int referenceSetID; private final int referenceSetID;
private DatabaseType dbType; private DatabaseType dbType;
/**
* Constructs a HashDbInfo object.
*
* @param hashSetName The name of the hash set
* @param knownFilesType The known files type
* @param searchDuringIngest Whether or not the db is searched during
* ingest
* @param sendIngestMessages Whether or not ingest messages are sent
* @param path The path to the db
* @param referenceSetID The reference set ID.
* @param version The version of the hash set.
* @param readOnly Whether or not the set is read-only.
* @param isCRType True for a central repository database type; false for a file type.
*/
HashDbInfo(String hashSetName, HashDbManager.HashDb.KnownFilesType knownFilesType, boolean searchDuringIngest, boolean sendIngestMessages,
String path, int referenceSetID, String version, boolean readOnly, boolean isCRType) {
this.hashSetName = hashSetName;
this.knownFilesType = knownFilesType;
this.searchDuringIngest = searchDuringIngest;
this.sendIngestMessages = sendIngestMessages;
this.path = path;
this.referenceSetID = referenceSetID;
this.version = version;
this.readOnly = readOnly;
this.dbType = isCRType ? DatabaseType.CENTRAL_REPOSITORY : DatabaseType.FILE;
}
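
For orientation only, a file-backed notable hash set could be described with this constructor as follows; all values here are hypothetical, and referenceSetID and version are assumed to be ignored for the file type:

    // Hypothetical usage from within the same package; values are illustrative.
    HashDbInfo exampleInfo = new HashDbInfo(
            "Example Notable Set",
            HashDbManager.HashDb.KnownFilesType.KNOWN_BAD,
            true,                        // searchDuringIngest
            false,                       // sendIngestMessages
            "C:\\hashsets\\example.kdb", // path
            -1,                          // referenceSetID (assumed unused for file type)
            null,                        // version (assumed unused for file type)
            false,                       // readOnly
            false);                      // isCRType == false -> DatabaseType.FILE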
/** /**
* Constructs a HashDbInfo object for files type * Constructs a HashDbInfo object for files type
* *

View File

@ -25,8 +25,10 @@ import java.awt.Frame;
import java.awt.event.KeyEvent; import java.awt.event.KeyEvent;
import java.beans.PropertyChangeEvent; import java.beans.PropertyChangeEvent;
import java.beans.PropertyChangeListener; import java.beans.PropertyChangeListener;
import java.io.File;
import java.util.ArrayList; import java.util.ArrayList;
import java.util.List; import java.util.List;
import java.util.MissingResourceException;
import java.util.logging.Level; import java.util.logging.Level;
import javax.swing.JComponent; import javax.swing.JComponent;
import javax.swing.JOptionPane; import javax.swing.JOptionPane;
@ -37,6 +39,7 @@ import javax.swing.event.ListSelectionEvent;
import javax.swing.event.ListSelectionListener; import javax.swing.event.ListSelectionListener;
import javax.swing.table.AbstractTableModel; import javax.swing.table.AbstractTableModel;
import javax.swing.table.TableCellRenderer; import javax.swing.table.TableCellRenderer;
import org.apache.commons.lang3.StringUtils;
import org.netbeans.spi.options.OptionsPanelController; import org.netbeans.spi.options.OptionsPanelController;
import org.openide.util.NbBundle; import org.openide.util.NbBundle;
import org.openide.util.NbBundle.Messages; import org.openide.util.NbBundle.Messages;
@ -197,68 +200,7 @@ public final class HashLookupSettingsPanel extends IngestModuleGlobalSettingsPan
if (db instanceof SleuthkitHashSet) { if (db instanceof SleuthkitHashSet) {
SleuthkitHashSet hashDb = (SleuthkitHashSet) db; SleuthkitHashSet hashDb = (SleuthkitHashSet) db;
updateForSleuthkitHashSet(ingestIsRunning, hashDb);
// Disable the central repo fields
hashDbVersionLabel.setText(Bundle.HashLookupSettingsPanel_notApplicable());
hashDbOrgLabel.setText(Bundle.HashLookupSettingsPanel_notApplicable());
// Enable the delete button if ingest is not running
deleteDatabaseButton.setEnabled(!ingestIsRunning);
try {
hashDbLocationLabel.setText(db.getDatabasePath());
} catch (TskCoreException ex) {
Logger.getLogger(HashLookupSettingsPanel.class.getName()).log(Level.SEVERE, "Error getting hash set path of " + db.getHashSetName() + " hash set", ex); //NON-NLS
hashDbLocationLabel.setText(ERROR_GETTING_PATH_TEXT);
}
try {
indexPathLabel.setText(hashDb.getIndexPath());
} catch (TskCoreException ex) {
Logger.getLogger(HashLookupSettingsPanel.class.getName()).log(Level.SEVERE, "Error getting index path of " + db.getHashSetName() + " hash set", ex); //NON-NLS
indexPathLabel.setText(ERROR_GETTING_PATH_TEXT);
}
// Update indexing components.
try {
if (hashDb.isIndexing()) {
indexButton.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.indexing"));
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.indexGen"));
hashDbIndexStatusLabel.setForeground(Color.black);
indexButton.setEnabled(false);
} else if (hashDb.hasIndex()) {
if (hashDb.hasIndexOnly()) {
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.indexOnly"));
} else {
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.indexed"));
}
hashDbIndexStatusLabel.setForeground(Color.black);
if (hashDb.canBeReIndexed()) {
indexButton.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.reIndex"));
indexButton.setEnabled(true);
} else {
indexButton.setText(NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.index"));
indexButton.setEnabled(false);
}
} else {
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.noIndex"));
hashDbIndexStatusLabel.setForeground(Color.red);
indexButton.setText(NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.index"));
indexButton.setEnabled(true);
}
} catch (TskCoreException ex) {
Logger.getLogger(HashLookupSettingsPanel.class.getName()).log(Level.SEVERE, "Error getting index state of hash set", ex); //NON-NLS
hashDbIndexStatusLabel.setText(ERROR_GETTING_INDEX_STATUS_TEXT);
hashDbIndexStatusLabel.setForeground(Color.red);
indexButton.setText(NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.index"));
indexButton.setEnabled(false);
}
} else { } else {
// Disable the file type fields/buttons // Disable the file type fields/buttons
@ -291,6 +233,93 @@ public final class HashLookupSettingsPanel extends IngestModuleGlobalSettingsPan
ingestWarningLabel.setVisible(ingestIsRunning); ingestWarningLabel.setVisible(ingestIsRunning);
} }
private static String getPathString(String path, boolean justFilename) {
if (StringUtils.isBlank(path)) {
return "";
}
if (!justFilename) {
return path;
}
return new File(path).getName();
}
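
Assuming hypothetical paths, the helper behaves as follows (it is private to the panel, so these calls are illustrative only):

    // Official sets display just the file name; user sets display the full path.
    getPathString("C:/hashsets/official/ExampleSet.Known.kdb", true);  // "ExampleSet.Known.kdb"
    getPathString("C:/hashsets/custom/myset.kdb", false);              // "C:/hashsets/custom/myset.kdb"
    getPathString("", true);                                           // ""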
/**
* Updates UI for a SleuthkitHashSet.
*
* @param ingestIsRunning Whether or not ingest is running.
* @param hashDb The hash set whose details are displayed in the panel.
*
* @throws MissingResourceException If a required resource bundle string is missing.
*/
private void updateForSleuthkitHashSet(boolean ingestIsRunning, SleuthkitHashSet hashDb) throws MissingResourceException {
// Disable the central repo fields
hashDbVersionLabel.setText(Bundle.HashLookupSettingsPanel_notApplicable());
hashDbOrgLabel.setText(Bundle.HashLookupSettingsPanel_notApplicable());
// Enable the delete button if ingest is not running and is not an official hashset
deleteDatabaseButton.setEnabled(!ingestIsRunning && !hashDb.isOfficialSet());
try {
String dbPath = getPathString(hashDb.getDatabasePath(), hashDb.isOfficialSet());
hashDbLocationLabel.setText(dbPath);
} catch (TskCoreException ex) {
Logger.getLogger(HashLookupSettingsPanel.class.getName()).log(Level.SEVERE, "Error getting hash set path of " + hashDb.getHashSetName() + " hash set", ex); //NON-NLS
hashDbLocationLabel.setText(ERROR_GETTING_PATH_TEXT);
}
try {
String indexPath = getPathString(hashDb.getIndexPath(), hashDb.isOfficialSet());
indexPathLabel.setText(indexPath);
} catch (TskCoreException ex) {
Logger.getLogger(HashLookupSettingsPanel.class.getName()).log(Level.SEVERE, "Error getting index path of " + hashDb.getHashSetName() + " hash set", ex); //NON-NLS
indexPathLabel.setText(ERROR_GETTING_PATH_TEXT);
}
// Update indexing components.
try {
if (hashDb.isIndexing()) {
indexButton.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.indexing"));
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.indexGen"));
hashDbIndexStatusLabel.setForeground(Color.black);
indexButton.setEnabled(false);
} else if (hashDb.hasIndex()) {
if (hashDb.hasIndexOnly()) {
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.indexOnly"));
} else {
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.indexed"));
}
hashDbIndexStatusLabel.setForeground(Color.black);
if (hashDb.canBeReIndexed()) {
indexButton.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.reIndex"));
indexButton.setEnabled(true);
} else {
indexButton.setText(NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.index"));
indexButton.setEnabled(false);
}
} else {
hashDbIndexStatusLabel.setText(
NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexStatusText.noIndex"));
hashDbIndexStatusLabel.setForeground(Color.red);
indexButton.setText(NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.index"));
indexButton.setEnabled(true);
}
} catch (TskCoreException ex) {
Logger.getLogger(HashLookupSettingsPanel.class.getName()).log(Level.SEVERE, "Error getting index state of hash set", ex); //NON-NLS
hashDbIndexStatusLabel.setText(ERROR_GETTING_INDEX_STATUS_TEXT);
hashDbIndexStatusLabel.setForeground(Color.red);
indexButton.setText(NbBundle.getMessage(this.getClass(), "HashDbConfigPanel.indexButtonText.index"));
indexButton.setEnabled(false);
}
}
private boolean isLocalIngestJobEvent(PropertyChangeEvent evt) { private boolean isLocalIngestJobEvent(PropertyChangeEvent evt) {
if (evt instanceof AutopsyEvent) { if (evt instanceof AutopsyEvent) {
AutopsyEvent event = (AutopsyEvent) evt; AutopsyEvent event = (AutopsyEvent) evt;

View File

@ -12,7 +12,7 @@ FilesSetDefsPanel.exportButtonActionPerformed.fileExistPrompt=File {0} exists, o
FilesSetDefsPanel.gigaBytes=Gigabytes FilesSetDefsPanel.gigaBytes=Gigabytes
# {0} - fileName # {0} - fileName
# {1} - errorMessage # {1} - errorMessage
FilesSetDefsPanel.importSetButtonActionPerformed.importError=The rules file "{0}" could not be read:\n"{1}." FilesSetDefsPanel.importSetButtonActionPerformed.importError=The rules file "{0}" could not be read:\n{1}.
FilesSetDefsPanel.importSetButtonActionPerformed.noFiles=No files sets were found in the selected files. FilesSetDefsPanel.importSetButtonActionPerformed.noFiles=No files sets were found in the selected files.
FilesSetDefsPanel.importSetButtonActionPerformed.noFilesSelected=No files sets were selected. FilesSetDefsPanel.importSetButtonActionPerformed.noFilesSelected=No files sets were selected.
# {0} - filter name # {0} - filter name
@ -24,6 +24,7 @@ FilesSetDefsPanel.interesting.exportButtonAction.featureName=Interesting Files S
FilesSetDefsPanel.interesting.ExportedMsg=Interesting files set exported FilesSetDefsPanel.interesting.ExportedMsg=Interesting files set exported
FilesSetDefsPanel.interesting.exportSetButton.text=Export Set FilesSetDefsPanel.interesting.exportSetButton.text=Export Set
FilesSetDefsPanel.interesting.failExportMsg=Export of interesting files set failed FilesSetDefsPanel.interesting.failExportMsg=Export of interesting files set failed
FilesSetDefsPanel.interesting.failImportMsg=Interesting files set not imported
FilesSetDefsPanel.interesting.fileExtensionFilterLbl=Autopsy Interesting File Set File (xml) FilesSetDefsPanel.interesting.fileExtensionFilterLbl=Autopsy Interesting File Set File (xml)
FilesSetDefsPanel.interesting.importButtonAction.featureName=Interesting Files Set Import FilesSetDefsPanel.interesting.importButtonAction.featureName=Interesting Files Set Import
FilesSetDefsPanel.interesting.importOwConflict=Import Interesting files set conflict FilesSetDefsPanel.interesting.importOwConflict=Import Interesting files set conflict
@ -33,7 +34,7 @@ FilesSetDefsPanel.interesting.newOwConflict=Interesting files set conflict
FilesSetDefsPanel.interesting.overwriteSetPrompt=Interesting files set "{0}" already exists locally, overwrite? FilesSetDefsPanel.interesting.overwriteSetPrompt=Interesting files set "{0}" already exists locally, overwrite?
# {0} - FilesSet name # {0} - FilesSet name
# {1} - New FilesSet name # {1} - New FilesSet name
FilesSetDefsPanel.interesting.standardFileConflict=A standard interesting file set already exists with the name "{0}." Would you like to rename the set to "{1}?" FilesSetDefsPanel.interesting.standardFileConflict=A standard interesting file set already exists with the name "{0}." Would you like to rename your set to "{1}?"
FilesSetDefsPanel.Interesting.Title=Global Interesting Items Settings FilesSetDefsPanel.Interesting.Title=Global Interesting Items Settings
FilesSetDefsPanel.kiloBytes=Kilobytes FilesSetDefsPanel.kiloBytes=Kilobytes
FilesSetDefsPanel.loadError=Error loading interesting files sets from file. FilesSetDefsPanel.loadError=Error loading interesting files sets from file.
@ -64,7 +65,7 @@ FilesSetRulePanel.ZeroFileSizeError=File size condition value must not be 0 (Unl
FilesSetsManager.allFilesAndDirectories=All Files and Directories (Not Unallocated Space) FilesSetsManager.allFilesAndDirectories=All Files and Directories (Not Unallocated Space)
FilesSetsManager.allFilesDirectoriesAndUnallocated=All Files, Directories, and Unallocated Space FilesSetsManager.allFilesDirectoriesAndUnallocated=All Files, Directories, and Unallocated Space
# {0} - regex # {0} - regex
InterestingItemsFilesSetSettings.readDateCondition.failedCompiledRegex=Error detmining ''{0}'' number InterestingItemsFilesSetSettings.readDateCondition.failedCompiledRegex=Error determining ''{0}'' number
# {0} - condition # {0} - condition
# {1} - rule # {1} - rule
InterestingItemsFilesSetSettings.readMetaTypeCondition.malformedXml=Files set is malformed for metatype condition, ''{0}'', in rule ''{1}'' InterestingItemsFilesSetSettings.readMetaTypeCondition.malformedXml=Files set is malformed for metatype condition, ''{0}'', in rule ''{1}''
@ -180,3 +181,6 @@ FilesSetDefsPanel.mimeTypeLabel.text=MIME Type:
FilesSetDefsPanel.fileSizeLabel.text=File Size: FilesSetDefsPanel.fileSizeLabel.text=File Size:
# {0} - filesSetName # {0} - filesSetName
StandardInterestingFileSetsLoader.customSuffixed={0} (Custom) StandardInterestingFileSetsLoader.customSuffixed={0} (Custom)
StandardInterestingFilesSetsLoader_cannotLoadStandard=Unable to properly read standard interesting files sets.
StandardInterestingFilesSetsLoader_cannotLoadUserConfigured=Unable to properly read user-configured interesting files sets.
StandardInterestingFilesSetsLoader_cannotUpdateInterestingFilesSets=Unable to write updated configuration for interesting files sets to config directory.

View File

@ -19,7 +19,7 @@ ReportProgressIndicator.switchToIndeterminateMessage=Report generation progress
ReportWizardDataSourceSelectionPanel.confirmEmptySelection=Are you sure you want to proceed with no selections? ReportWizardDataSourceSelectionPanel.confirmEmptySelection=Are you sure you want to proceed with no selections?
ReportWizardDataSourceSelectionPanel.finishButton.text=Finish ReportWizardDataSourceSelectionPanel.finishButton.text=Finish
ReportWizardDataSourceSelectionPanel.nextButton.text=Next ReportWizardDataSourceSelectionPanel.nextButton.text=Next
ReportWizardDataSourceSelectionPanel.title=Select which datasource(s) to include ReportWizardDataSourceSelectionPanel.title=Select which data source(s) to include
ReportWizardFileOptionsVisualPanel.jLabel1.text=Select items to include in File Report: ReportWizardFileOptionsVisualPanel.jLabel1.text=Select items to include in File Report:
ReportWizardFileOptionsVisualPanel.deselectAllButton.text=Deselect All ReportWizardFileOptionsVisualPanel.deselectAllButton.text=Deselect All
ReportWizardFileOptionsVisualPanel.selectAllButton.text=Select All ReportWizardFileOptionsVisualPanel.selectAllButton.text=Select All

View File

@ -87,7 +87,7 @@ public class ReportWizardDataSourceSelectionPanel implements WizardDescriptor.Fi
} }
@NbBundle.Messages({ @NbBundle.Messages({
"ReportWizardDataSourceSelectionPanel.title=Select which datasource(s) to include" "ReportWizardDataSourceSelectionPanel.title=Select which data source(s) to include"
}) })
@Override @Override
public CheckBoxListPanel<Long> getComponent() { public CheckBoxListPanel<Long> getComponent() {

View File

@ -1,466 +0,0 @@
/*
* Autopsy Forensic Browser
*
* Copyright 2018-2020 Basis Technology Corp.
* Contact: carrier <at> sleuthkit <dot> org
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
package org.sleuthkit.autopsy.report.modules.caseuco;
import com.fasterxml.jackson.annotation.JsonAnyGetter;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import java.io.IOException;
import java.nio.file.Path;
import java.util.SimpleTimeZone;
import java.util.TimeZone;
import org.sleuthkit.autopsy.casemodule.Case;
import org.sleuthkit.autopsy.casemodule.Case.CaseType;
import org.sleuthkit.autopsy.datamodel.ContentUtils;
import org.sleuthkit.datamodel.AbstractFile;
import org.sleuthkit.datamodel.SleuthkitCase;
import com.fasterxml.jackson.core.JsonEncoding;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.core.util.DefaultIndenter;
import com.fasterxml.jackson.core.util.DefaultPrettyPrinter;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.common.base.Strings;
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.sleuthkit.datamodel.Content;
import org.sleuthkit.datamodel.Image;
import org.sleuthkit.datamodel.TskCoreException;
/**
* Writes Autopsy DataModel objects to Case UCO format.
*
* Clients are expected to add the Case first. Then they should add each data
* source before adding any files for that data source.
*
* Here is an example, where we add everything:
*
* Path directory = Paths.get("C:", "Reports");
* CaseUcoReportGenerator caseUco = new CaseUcoReportGenerator(directory, "my-report");
*
* Case caseObj = Case.getCurrentCase();
* caseUco.addCase(caseObj);
* List<Content> dataSources = caseObj.getDataSources();
* for(Content dataSource : dataSources) {
* caseUco.addDataSource(dataSource, caseObj);
* List<AbstractFile> files = getAllFilesInDataSource(dataSource);
* for(AbstractFile file : files) {
* caseUco.addFile(file, dataSource);
* }
* }
*
* Path reportOutput = caseUco.generateReport();
* //Done. Report at - "C:\Reports\my-report.json-ld"
*
* Please note that the life cycle for this class ends with generateReport().
* The underlying file handle to 'my-report.json-ld' will be closed. Any further
* calls to addX() will result in an IOException.
*/
public final class CaseUcoReportGenerator {
private static final String EXTENSION = "json-ld";
private final TimeZone timeZone;
private final Path reportPath;
private final JsonGenerator reportGenerator;
/**
* Creates a CaseUCO Report Generator that writes a report in the specified
* directory.
*
* TimeZone is assumed to be GMT+0 for formatting file creation time,
* accessed time and modified time.
*
* @param directory Directory to write the CaseUCO report file. Assumes the
* calling thread has write access to the directory and that the directory
* exists.
* @param reportName Name of the CaseUCO report file.
* @throws IOException If an I/O error occurs
*/
public CaseUcoReportGenerator(Path directory, String reportName) throws IOException {
this.reportPath = directory.resolve(reportName + "." + EXTENSION);
JsonFactory jsonGeneratorFactory = new JsonFactory();
reportGenerator = jsonGeneratorFactory.createGenerator(reportPath.toFile(), JsonEncoding.UTF8);
// Puts a newline between each Key, Value pair for readability.
reportGenerator.setPrettyPrinter(new DefaultPrettyPrinter()
.withObjectIndenter(new DefaultIndenter(" ", "\n")));
ObjectMapper mapper = new ObjectMapper();
mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
mapper.setSerializationInclusion(JsonInclude.Include.NON_EMPTY);
reportGenerator.setCodec(mapper);
reportGenerator.writeStartObject();
reportGenerator.writeFieldName("@graph");
reportGenerator.writeStartArray();
//Assume GMT+0
this.timeZone = new SimpleTimeZone(0, "GMT");
}
/**
* Adds an AbstractFile instance to the Case UCO report.
*
* @param file AbstractFile instance to write
* @param parentDataSource The parent data source for this abstract file. It
* is assumed that this parent has been written to the report (via
* addDataSource) prior to this call. Otherwise, the report may be invalid.
* @throws IOException If an I/O error occurs.
* @throws TskCoreException
*/
public void addFile(AbstractFile file, Content parentDataSource) throws IOException, TskCoreException {
addFile(file, parentDataSource, null);
}
/**
* Adds an AbstractFile instance to the Case UCO report.
*
* @param file AbstractFile instance to write
* @param parentDataSource The parent data source for this abstract file. It
* is assumed that this parent has been written to the report (via
* addDataSource) prior to this call. Otherwise, the report may be invalid.
* @param localPath The location of the file on secondary storage, somewhere
* other than the case. Example: local disk. This value will be ignored if
* it is null.
* @throws IOException
* @throws TskCoreException
*/
public void addFile(AbstractFile file, Content parentDataSource, Path localPath) throws IOException, TskCoreException {
String fileTraceId = getFileTraceId(file);
//Create the Trace CASE node, which will contain attributes about some evidence.
//Trace is the standard term for evidence. For us, this means file system files.
CASENode fileTrace = new CASENode(fileTraceId, "Trace");
//The bits of evidence for each Trace node are contained within Property
//Bundles. There are a number of Property Bundles available in the CASE ontology.
//Build up the File Property Bundle, as the name implies - properties of
//the file itself.
CASEPropertyBundle filePropertyBundle = createFileBundle(file);
fileTrace.addBundle(filePropertyBundle);
//Build up the ContentData Property Bundle, as the name implies - properties of
//the File data itself.
CASEPropertyBundle contentDataPropertyBundle = createContentDataBundle(file);
fileTrace.addBundle(contentDataPropertyBundle);
if(localPath != null) {
String urlTraceId = getURLTraceId(file);
CASENode urlTrace = new CASENode(urlTraceId, "Trace");
CASEPropertyBundle urlPropertyBundle = new CASEPropertyBundle("URL");
urlPropertyBundle.addProperty("fullValue", localPath.toString());
urlTrace.addBundle(urlPropertyBundle);
contentDataPropertyBundle.addProperty("dataPayloadReferenceUrl", urlTraceId);
reportGenerator.writeObject(urlTrace);
}
//Create the Relationship CASE node. This defines how the Trace CASE node described above
//is related to another CASE node (in this case, the parent data source).
String relationshipID = getRelationshipId(file);
CASENode relationship = createRelationshipNode(relationshipID,
fileTraceId, getDataSourceTraceId(parentDataSource));
//Build up the PathRelation bundle for the relationship node,
//as the name implies - the Path of the Trace in the data source.
CASEPropertyBundle pathRelationPropertyBundle = new CASEPropertyBundle("PathRelation");
pathRelationPropertyBundle.addProperty("path", file.getUniquePath());
relationship.addBundle(pathRelationPropertyBundle);
//This completes the triage, write them to JSON.
reportGenerator.writeObject(fileTrace);
reportGenerator.writeObject(relationship);
}
private String getURLTraceId(Content content) {
return "url-" + content.getId();
}
/**
* All relationship nodes will be the same within our context. Namely, contained-within
* and isDirectional as true.
*/
private CASENode createRelationshipNode(String relationshipID, String sourceID, String targetID) {
CASENode relationship = new CASENode(relationshipID, "Relationship");
relationship.addProperty("source", sourceID);
relationship.addProperty("target", targetID);
relationship.addProperty("kindOfRelationship", "contained-within");
relationship.addProperty("isDirectional", true);
return relationship;
}
/**
* Creates a File Property Bundle with a selection of file attributes.
*/
private CASEPropertyBundle createFileBundle(AbstractFile file) throws TskCoreException {
CASEPropertyBundle filePropertyBundle = new CASEPropertyBundle("File");
String createdTime = ContentUtils.getStringTimeISO8601(file.getCrtime(), timeZone);
String accessedTime = ContentUtils.getStringTimeISO8601(file.getAtime(), timeZone);
String modifiedTime = ContentUtils.getStringTimeISO8601(file.getMtime(), timeZone);
filePropertyBundle.addProperty("createdTime", createdTime);
filePropertyBundle.addProperty("accessedTime", accessedTime);
filePropertyBundle.addProperty("modifiedTime", modifiedTime);
if (!Strings.isNullOrEmpty(file.getNameExtension())) {
filePropertyBundle.addProperty("extension", file.getNameExtension());
}
filePropertyBundle.addProperty("fileName", file.getName());
filePropertyBundle.addProperty("filePath", file.getUniquePath());
filePropertyBundle.addProperty("isDirectory", file.isDir());
filePropertyBundle.addProperty("sizeInBytes", Long.toString(file.getSize()));
return filePropertyBundle;
}
/**
* Creates a Content Data Property Bundle with a selection of file attributes.
*/
private CASEPropertyBundle createContentDataBundle(AbstractFile file) {
CASEPropertyBundle contentDataPropertyBundle = new CASEPropertyBundle("ContentData");
if (!Strings.isNullOrEmpty(file.getMIMEType())) {
contentDataPropertyBundle.addProperty("mimeType", file.getMIMEType());
}
if (!Strings.isNullOrEmpty(file.getMd5Hash())) {
List<CASEPropertyBundle> hashPropertyBundles = new ArrayList<>();
CASEPropertyBundle md5HashPropertyBundle = new CASEPropertyBundle("Hash");
md5HashPropertyBundle.addProperty("hashMethod", "MD5");
md5HashPropertyBundle.addProperty("hashValue", file.getMd5Hash());
hashPropertyBundles.add(md5HashPropertyBundle);
contentDataPropertyBundle.addProperty("hash", hashPropertyBundles);
}
contentDataPropertyBundle.addProperty("sizeInBytes", Long.toString(file.getSize()));
return contentDataPropertyBundle;
}
/**
* Creates a unique CASE Node file trace id.
*/
private String getFileTraceId(AbstractFile file) {
return "file-" + file.getId();
}
/**
* Creates a unique CASE Node relationship id value.
*/
private String getRelationshipId(Content content) {
return "relationship-" + content.getId();
}
/**
* Adds a Content instance (which is known to be a DataSource) to the CASE
* report. This means writing a selection of attributes to a CASE or UCO
* object.
*
* @param dataSource Datasource content to write
* @param parentCase The parent case that this data source belongs in. It is
* assumed that this parent has been written to the report (via addCase)
* prior to this call. Otherwise, the report may be invalid.
*/
public void addDataSource(Content dataSource, Case parentCase) throws IOException, TskCoreException {
String dataSourceTraceId = this.getDataSourceTraceId(dataSource);
CASENode dataSourceTrace = new CASENode(dataSourceTraceId, "Trace");
CASEPropertyBundle filePropertyBundle = new CASEPropertyBundle("File");
String dataSourcePath = getDataSourcePath(dataSource);
filePropertyBundle.addProperty("filePath", dataSourcePath);
dataSourceTrace.addBundle(filePropertyBundle);
if (dataSource.getSize() > 0) {
CASEPropertyBundle contentDataPropertyBundle = new CASEPropertyBundle("ContentData");
contentDataPropertyBundle.addProperty("sizeInBytes", Long.toString(dataSource.getSize()));
dataSourceTrace.addBundle(contentDataPropertyBundle);
}
// create a "relationship" entry between the case and the data source
String caseTraceId = getCaseTraceId(parentCase);
String relationshipTraceId = getRelationshipId(dataSource);
CASENode relationship = createRelationshipNode(relationshipTraceId,
dataSourceTraceId, caseTraceId);
CASEPropertyBundle pathRelationBundle = new CASEPropertyBundle("PathRelation");
pathRelationBundle.addProperty("path", dataSourcePath);
relationship.addBundle(pathRelationBundle);
//This completes the triage, write them to JSON.
reportGenerator.writeObject(dataSourceTrace);
reportGenerator.writeObject(relationship);
}
private String getDataSourcePath(Content dataSource) {
String dataSourcePath = "";
if (dataSource instanceof Image) {
String[] paths = ((Image) dataSource).getPaths();
if (paths.length > 0) {
//Get the first data source in the path, as this will
//be reflected in each file's uniquePath.
dataSourcePath = paths[0];
}
} else {
dataSourcePath = dataSource.getName();
}
dataSourcePath = dataSourcePath.replaceAll("\\\\", "/");
return dataSourcePath;
}
/**
* Creates a unique Case UCO trace id for a data source.
*
* @param dataSource
* @return
*/
private String getDataSourceTraceId(Content dataSource) {
return "data-source-" + dataSource.getId();
}
/**
* Adds a Case instance to the Case UCO report. This means writing a
* selection of Case attributes to a CASE/UCO object.
*
* @param caseObj Case instance to include in the report.
* @throws IOException If an I/O error is encountered.
*/
public void addCase(Case caseObj) throws IOException {
SleuthkitCase skCase = caseObj.getSleuthkitCase();
String caseDirPath = skCase.getDbDirPath();
String caseTraceId = getCaseTraceId(caseObj);
CASENode caseTrace = new CASENode(caseTraceId, "Trace");
CASEPropertyBundle filePropertyBundle = new CASEPropertyBundle("File");
// replace double slashes with single ones
caseDirPath = caseDirPath.replaceAll("\\\\", "/");
Case.CaseType caseType = caseObj.getCaseType();
if (caseType.equals(CaseType.SINGLE_USER_CASE)) {
filePropertyBundle.addProperty("filePath", caseDirPath + "/" + skCase.getDatabaseName());
filePropertyBundle.addProperty("isDirectory", false);
} else {
filePropertyBundle.addProperty("filePath", caseDirPath);
filePropertyBundle.addProperty("isDirectory", true);
}
caseTrace.addBundle(filePropertyBundle);
reportGenerator.writeObject(caseTrace);
}
/**
* Creates a unique Case UCO trace id for a Case.
*
* @param caseObj
* @return
*/
private String getCaseTraceId(Case caseObj) {
return "case-" + caseObj.getName();
}
/**
* Returns a Path to the completed Case UCO report file.
*
* This marks the end of the CaseUcoReportGenerator's life cycle. This
* function will close an underlying file handles, meaning any subsequent
* calls to addX() will result in an IOException.
*
* @return The Path to the finalized report.
* @throws IOException If an I/O error occurs.
*/
public Path generateReport() throws IOException {
//Finalize the report.
reportGenerator.writeEndArray();
reportGenerator.writeEndObject();
reportGenerator.close();
return reportPath;
}
/**
* A CASE or UCO object. CASE objects can have properties and
* property bundles.
*/
private final class CASENode {
private final String id;
private final String type;
//Dynamic properties added to this CASENode.
private final Map<String, Object> properties;
private final List<CASEPropertyBundle> propertyBundle;
public CASENode(String id, String type) {
this.id = id;
this.type = type;
properties = new LinkedHashMap<>();
propertyBundle = new ArrayList<>();
}
@JsonProperty("@id")
public String getId() {
return id;
}
@JsonProperty("@type")
public String getType() {
return type;
}
@JsonAnyGetter
public Map<String, Object> getProperties() {
return properties;
}
@JsonProperty("propertyBundle")
public List<CASEPropertyBundle> getPropertyBundle() {
return propertyBundle;
}
public void addProperty(String key, Object val) {
properties.put(key, val);
}
public void addBundle(CASEPropertyBundle bundle) {
propertyBundle.add(bundle);
}
}
/**
* Contains CASE or UCO properties.
*/
private final class CASEPropertyBundle {
private final Map<String, Object> properties;
public CASEPropertyBundle(String type) {
properties = new LinkedHashMap<>();
addProperty("@type", type);
}
@JsonAnyGetter
public Map<String, Object> getProperties() {
return properties;
}
public void addProperty(String key, Object val) {
properties.put(key, val);
}
}
}

View File

@ -19,7 +19,15 @@
*/ */
package org.sleuthkit.autopsy.report.modules.caseuco; package org.sleuthkit.autopsy.report.modules.caseuco;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonElement;
import com.google.gson.stream.JsonWriter;
import java.io.FileOutputStream;
import java.io.IOException; import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.nio.file.Files; import java.nio.file.Files;
import java.nio.file.Path; import java.nio.file.Path;
import java.nio.file.Paths; import java.nio.file.Paths;
@ -39,14 +47,20 @@ import org.sleuthkit.autopsy.ingest.IngestManager;
import org.sleuthkit.autopsy.report.GeneralReportModule; import org.sleuthkit.autopsy.report.GeneralReportModule;
import org.sleuthkit.autopsy.report.GeneralReportSettings; import org.sleuthkit.autopsy.report.GeneralReportSettings;
import org.sleuthkit.autopsy.report.ReportProgressPanel; import org.sleuthkit.autopsy.report.ReportProgressPanel;
import org.sleuthkit.caseuco.CaseUcoExporter;
import org.sleuthkit.caseuco.ContentNotExportableException;
import org.sleuthkit.datamodel.AbstractFile; import org.sleuthkit.datamodel.AbstractFile;
import org.sleuthkit.datamodel.BlackboardArtifact;
import org.sleuthkit.datamodel.BlackboardArtifact.ARTIFACT_TYPE;
import org.sleuthkit.datamodel.Content; import org.sleuthkit.datamodel.Content;
import org.sleuthkit.datamodel.DataSource;
import org.sleuthkit.datamodel.TskCoreException; import org.sleuthkit.datamodel.TskCoreException;
import org.sleuthkit.datamodel.TskData; import org.sleuthkit.datamodel.TskData;
import org.sleuthkit.datamodel.blackboardutils.attributes.BlackboardJsonAttrUtil;
/** /**
* CaseUcoReportModule generates a report in CASE-UCO format. This module will * Exports an Autopsy case to a CASE-UCO report file. This module will write all
* write all files and data sources to the report. * files and artifacts from the selected data sources.
*/ */
public final class CaseUcoReportModule implements GeneralReportModule { public final class CaseUcoReportModule implements GeneralReportModule {
@ -54,14 +68,16 @@ public final class CaseUcoReportModule implements GeneralReportModule {
private static final CaseUcoReportModule SINGLE_INSTANCE = new CaseUcoReportModule(); private static final CaseUcoReportModule SINGLE_INSTANCE = new CaseUcoReportModule();
//Supported types of TSK_FS_FILES //Supported types of TSK_FS_FILES
private static final Set<Short> SUPPORTED_TYPES = new HashSet<Short>() {{ private static final Set<Short> SUPPORTED_TYPES = new HashSet<Short>() {
add(TskData.TSK_FS_META_TYPE_ENUM.TSK_FS_META_TYPE_UNDEF.getValue()); {
add(TskData.TSK_FS_META_TYPE_ENUM.TSK_FS_META_TYPE_REG.getValue()); add(TskData.TSK_FS_META_TYPE_ENUM.TSK_FS_META_TYPE_UNDEF.getValue());
add(TskData.TSK_FS_META_TYPE_ENUM.TSK_FS_META_TYPE_VIRT.getValue()); add(TskData.TSK_FS_META_TYPE_ENUM.TSK_FS_META_TYPE_REG.getValue());
}}; add(TskData.TSK_FS_META_TYPE_ENUM.TSK_FS_META_TYPE_VIRT.getValue());
}
};
private static final String REPORT_FILE_NAME = "CASE_UCO_output"; private static final String REPORT_FILE_NAME = "CASE_UCO_output";
private static final String EXTENSION = "json-ld"; private static final String EXTENSION = "jsonld";
// Hidden constructor for the report // Hidden constructor for the report
private CaseUcoReportModule() { private CaseUcoReportModule() {
@ -84,7 +100,7 @@ public final class CaseUcoReportModule implements GeneralReportModule {
@Override @Override
public String getRelativeFilePath() { public String getRelativeFilePath() {
return REPORT_FILE_NAME + "." + EXTENSION; return REPORT_FILE_NAME + "." + EXTENSION;
} }
@Override @Override
@ -109,7 +125,7 @@ public final class CaseUcoReportModule implements GeneralReportModule {
/** /**
* Generates a CASE-UCO format report for all files in the Case. * Generates a CASE-UCO format report for all files in the Case.
* *
* @param settings Report settings. * @param settings Report settings.
* @param progressPanel panel to update the report's progress * @param progressPanel panel to update the report's progress
*/ */
@NbBundle.Messages({ @NbBundle.Messages({
@ -136,46 +152,91 @@ public final class CaseUcoReportModule implements GeneralReportModule {
} catch (IOException ex) { } catch (IOException ex) {
logger.log(Level.WARNING, "Unable to create directory for CASE-UCO report.", ex); logger.log(Level.WARNING, "Unable to create directory for CASE-UCO report.", ex);
progressPanel.complete(ReportProgressPanel.ReportStatus.ERROR, progressPanel.complete(ReportProgressPanel.ReportStatus.ERROR,
Bundle.CaseUcoReportModule_unableToCreateDirectories()); Bundle.CaseUcoReportModule_unableToCreateDirectories());
return; return;
} }
CaseUcoReportGenerator generator = Case currentCase = Case.getCurrentCaseThrows();
new CaseUcoReportGenerator(reportDirectory, REPORT_FILE_NAME);
//First write the Case to the report file. Path caseJsonReportFile = reportDirectory.resolve(REPORT_FILE_NAME + "." + EXTENSION);
Case caseObj = Case.getCurrentCaseThrows();
generator.addCase(caseObj);
List<Content> dataSources = caseObj.getDataSources().stream() try (OutputStream stream = new FileOutputStream(caseJsonReportFile.toFile());
.filter((dataSource) -> { JsonWriter reportWriter = new JsonWriter(new OutputStreamWriter(stream, "UTF-8"))) {
if(settings.getSelectedDataSources() == null) { Gson gson = new GsonBuilder().setPrettyPrinting().create();
// Assume all data sources if list is null. reportWriter.setIndent(" ");
return true; reportWriter.beginObject();
reportWriter.name("@graph");
reportWriter.beginArray();
CaseUcoExporter exporter = new CaseUcoExporter(currentCase.getSleuthkitCase());
for (JsonElement element : exporter.exportSleuthkitCase()) {
gson.toJson(element, reportWriter);
}
// Get a list of selected data sources to process.
List<DataSource> dataSources = getSelectedDataSources(currentCase, settings);
progressPanel.setIndeterminate(false);
progressPanel.setMaximumProgress(dataSources.size());
progressPanel.start();
// First stage of reporting is for files and data sources.
// Iterate through each data source and dump all files contained
// in that data source.
for (int i = 0; i < dataSources.size(); i++) {
DataSource dataSource = dataSources.get(i);
progressPanel.updateStatusLabel(String.format(
Bundle.CaseUcoReportModule_processingDataSource(),
dataSource.getName()));
// Add the data source export.
for (JsonElement element : exporter.exportDataSource(dataSource)) {
gson.toJson(element, reportWriter);
}
// Search all children of the data source.
performDepthFirstSearch(dataSource, gson, exporter, reportWriter);
progressPanel.setProgress(i + 1);
}
// Second stage of reporting handles artifacts.
Set<Long> dataSourceIds = dataSources.stream()
.map((datasource) -> datasource.getId())
.collect(Collectors.toSet());
logger.log(Level.INFO, "Writing all artifacts to the CASE-UCO report. "
+ "Keyword hits will be skipped as they can't be represented"
+ " in CASE format.");
// Write all standard artifacts that are contained within the
// selected data sources.
for (ARTIFACT_TYPE artType : currentCase.getSleuthkitCase().getBlackboardArtifactTypesInUse()) {
if(artType.equals(BlackboardArtifact.ARTIFACT_TYPE.TSK_KEYWORD_HIT)) {
// Keyword hits cannot be represented in CASE.
continue;
}
for (BlackboardArtifact artifact : currentCase.getSleuthkitCase().getBlackboardArtifacts(artType)) {
if (dataSourceIds.contains(artifact.getDataSource().getId())) {
try {
for (JsonElement element : exporter.exportBlackboardArtifact(artifact)) {
gson.toJson(element, reportWriter);
}
} catch (ContentNotExportableException ex) {
logger.log(Level.INFO, String.format("Unable to export blackboard artifact (id: %d, type: %d) to CASE/UCO. "
+ "The artifact type is either not supported or the artifact instance does not have any "
+ "exportable attributes.", artifact.getId(), artType.getTypeID()));
} catch (BlackboardJsonAttrUtil.InvalidJsonException ex) {
logger.log(Level.WARNING, String.format("Artifact instance (id: %d, type: %d) contained a "
+ "malformed json attribute.", artifact.getId(), artType.getTypeID()), ex);
}
} }
return settings.getSelectedDataSources().contains(dataSource.getId()); }
}) }
.collect(Collectors.toList());
progressPanel.setIndeterminate(false); reportWriter.endArray();
progressPanel.setMaximumProgress(dataSources.size()); reportWriter.endObject();
progressPanel.start();
//Then search each data source for file content.
for(int i = 0; i < dataSources.size(); i++) {
Content dataSource = dataSources.get(i);
progressPanel.updateStatusLabel(String.format(
Bundle.CaseUcoReportModule_processingDataSource(),
dataSource.getName()));
//Add the data source and then all children.
generator.addDataSource(dataSource, caseObj);
performDepthFirstSearch(dataSource, generator);
progressPanel.setProgress(i+1);
} }
//Complete the report. currentCase.addReport(caseJsonReportFile.toString(),
Path reportPath = generator.generateReport();
caseObj.addReport(reportPath.toString(),
Bundle.CaseUcoReportModule_srcModuleName(), Bundle.CaseUcoReportModule_srcModuleName(),
REPORT_FILE_NAME); REPORT_FILE_NAME);
progressPanel.complete(ReportProgressPanel.ReportStatus.COMPLETE); progressPanel.complete(ReportProgressPanel.ReportStatus.COMPLETE);
@ -196,6 +257,21 @@ public final class CaseUcoReportModule implements GeneralReportModule {
progressPanel.complete(ReportProgressPanel.ReportStatus.COMPLETE); progressPanel.complete(ReportProgressPanel.ReportStatus.COMPLETE);
} }
/**
* Get the selected data sources from the settings instance.
*/
private List<DataSource> getSelectedDataSources(Case currentCase, GeneralReportSettings settings) throws TskCoreException {
return currentCase.getSleuthkitCase().getDataSources().stream()
.filter((dataSource) -> {
if (settings.getSelectedDataSources() == null) {
// Assume all data sources if list is null.
return true;
}
return settings.getSelectedDataSources().contains(dataSource.getId());
})
.collect(Collectors.toList());
}
/** /**
* Warn the user if ingest is still ongoing. * Warn the user if ingest is still ongoing.
*/ */
@ -207,25 +283,27 @@ public final class CaseUcoReportModule implements GeneralReportModule {
/** /**
* Perform DFS on the data sources tree, which will search it in entirety. * Perform DFS on the data sources tree, which will search it in entirety.
* This traversal is more memory efficient than BFS (Breadth first search).
*/ */
private void performDepthFirstSearch(Content dataSource, private void performDepthFirstSearch(DataSource dataSource,
CaseUcoReportGenerator generator) throws IOException, TskCoreException { Gson gson, CaseUcoExporter exporter, JsonWriter reportWriter) throws IOException, TskCoreException {
Deque<Content> stack = new ArrayDeque<>(); Deque<Content> stack = new ArrayDeque<>();
stack.addAll(dataSource.getChildren()); stack.addAll(dataSource.getChildren());
//Depth First Search the data source tree. //Depth First Search the data source tree.
while(!stack.isEmpty()) { while (!stack.isEmpty()) {
Content current = stack.pop(); Content current = stack.pop();
if(current instanceof AbstractFile) { if (current instanceof AbstractFile) {
AbstractFile f = (AbstractFile) (current); AbstractFile file = (AbstractFile) (current);
if(SUPPORTED_TYPES.contains(f.getMetaType().getValue())) { if (SUPPORTED_TYPES.contains(file.getMetaType().getValue())) {
generator.addFile(f, dataSource);
for (JsonElement element : exporter.exportAbstractFile(file)) {
gson.toJson(element, reportWriter);
}
} }
} }
for(Content child : current.getChildren()) { for (Content child : current.getChildren()) {
stack.push(child); stack.push(child);
} }
} }
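
To make the shape of the output concrete: the rewritten module streams a single JSON-LD object whose "@graph" array holds the nodes produced by CaseUcoExporter. The standalone sketch below reproduces only that skeleton with a placeholder node; the class and node names are illustrative and it does not call the exporter itself:

    import com.google.gson.Gson;
    import com.google.gson.GsonBuilder;
    import com.google.gson.JsonObject;
    import com.google.gson.stream.JsonWriter;
    import java.io.OutputStreamWriter;
    import java.nio.charset.StandardCharsets;

    class GraphSkeletonSketch {
        public static void main(String[] args) throws Exception {
            Gson gson = new GsonBuilder().setPrettyPrinting().create();
            try (JsonWriter writer = new JsonWriter(
                    new OutputStreamWriter(System.out, StandardCharsets.UTF_8))) {
                writer.setIndent("    ");
                writer.beginObject();
                writer.name("@graph");
                writer.beginArray();
                // In the report module each element comes from CaseUcoExporter
                // (exportSleuthkitCase, exportDataSource, exportAbstractFile,
                // exportBlackboardArtifact); this placeholder stands in for them.
                JsonObject placeholder = new JsonObject();
                placeholder.addProperty("@id", "example-node");
                gson.toJson(placeholder, writer);
                writer.endArray();
                writer.endObject();
            }
        }
    }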

View File

@ -20,12 +20,19 @@ package org.sleuthkit.autopsy.report.modules.portablecase;
import com.google.common.collect.ArrayListMultimap; import com.google.common.collect.ArrayListMultimap;
import com.google.common.collect.Multimap; import com.google.common.collect.Multimap;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonElement;
import com.google.gson.stream.JsonWriter;
import org.sleuthkit.autopsy.report.ReportModule; import org.sleuthkit.autopsy.report.ReportModule;
import java.util.logging.Level; import java.util.logging.Level;
import java.io.BufferedReader; import java.io.BufferedReader;
import java.io.File; import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStreamReader; import java.io.InputStreamReader;
import java.io.IOException; import java.io.IOException;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.nio.file.Files; import java.nio.file.Files;
import java.nio.file.Path; import java.nio.file.Path;
import java.nio.file.Paths; import java.nio.file.Paths;
@ -49,7 +56,7 @@ import org.sleuthkit.autopsy.coreutils.PlatformUtil;
import org.sleuthkit.autopsy.datamodel.ContentUtils; import org.sleuthkit.autopsy.datamodel.ContentUtils;
import org.sleuthkit.autopsy.coreutils.FileTypeUtils.FileTypeCategory; import org.sleuthkit.autopsy.coreutils.FileTypeUtils.FileTypeCategory;
import org.sleuthkit.autopsy.report.ReportProgressPanel; import org.sleuthkit.autopsy.report.ReportProgressPanel;
import org.sleuthkit.autopsy.report.modules.caseuco.CaseUcoReportGenerator; import org.sleuthkit.caseuco.CaseUcoExporter;
import org.sleuthkit.datamodel.AbstractFile; import org.sleuthkit.datamodel.AbstractFile;
import org.sleuthkit.datamodel.BlackboardArtifact; import org.sleuthkit.datamodel.BlackboardArtifact;
import org.sleuthkit.datamodel.BlackboardArtifactTag; import org.sleuthkit.datamodel.BlackboardArtifactTag;
@ -76,6 +83,7 @@ import org.sleuthkit.datamodel.VolumeSystem;
* Creates a portable case from tagged files * Creates a portable case from tagged files
*/ */
public class PortableCaseReportModule implements ReportModule { public class PortableCaseReportModule implements ReportModule {
private static final Logger logger = Logger.getLogger(PortableCaseReportModule.class.getName()); private static final Logger logger = Logger.getLogger(PortableCaseReportModule.class.getName());
private static final String FILE_FOLDER_NAME = "PortableCaseFiles"; // NON-NLS private static final String FILE_FOLDER_NAME = "PortableCaseFiles"; // NON-NLS
private static final String UNKNOWN_FILE_TYPE_FOLDER = "Other"; // NON-NLS private static final String UNKNOWN_FILE_TYPE_FOLDER = "Other"; // NON-NLS
@ -145,7 +153,7 @@ public class PortableCaseReportModule implements ReportModule {
/** /**
* Convenience method for handling cancellation * Convenience method for handling cancellation
* *
* @param progressPanel The report progress panel * @param progressPanel The report progress panel
*/ */
private void handleCancellation(ReportProgressPanel progressPanel) { private void handleCancellation(ReportProgressPanel progressPanel) {
logger.log(Level.INFO, "Portable case creation canceled by user"); // NON-NLS logger.log(Level.INFO, "Portable case creation canceled by user"); // NON-NLS
@ -155,14 +163,14 @@ public class PortableCaseReportModule implements ReportModule {
} }
/** /**
* Convenience method to avoid code duplication. * Convenience method to avoid code duplication. Assumes that if an
* Assumes that if an exception is supplied then the error is SEVERE. Otherwise * exception is supplied then the error is SEVERE. Otherwise it is logged as
* it is logged as a WARNING. * a WARNING.
* *
* @param logWarning Warning to write to the log * @param logWarning Warning to write to the log
* @param dialogWarning Warning to write to a pop-up window * @param dialogWarning Warning to write to a pop-up window
* @param ex The exception (can be null) * @param ex The exception (can be null)
* @param progressPanel The report progress panel * @param progressPanel The report progress panel
*/ */
private void handleError(String logWarning, String dialogWarning, Exception ex, ReportProgressPanel progressPanel) { private void handleError(String logWarning, String dialogWarning, Exception ex, ReportProgressPanel progressPanel) {
if (ex == null) { if (ex == null) {
@ -214,13 +222,13 @@ public class PortableCaseReportModule implements ReportModule {
// Validate the input parameters // Validate the input parameters
File outputDir = new File(reportPath); File outputDir = new File(reportPath);
if (! outputDir.exists()) { if (!outputDir.exists()) {
handleError("Output folder " + outputDir.toString() + " does not exist", handleError("Output folder " + outputDir.toString() + " does not exist",
Bundle.PortableCaseReportModule_generateReport_outputDirDoesNotExist(outputDir.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_generateReport_outputDirDoesNotExist(outputDir.toString()), null, progressPanel); // NON-NLS
return; return;
} }
if (! outputDir.isDirectory()) { if (!outputDir.isDirectory()) {
handleError("Output folder " + outputDir.toString() + " is not a folder", handleError("Output folder " + outputDir.toString() + " is not a folder",
Bundle.PortableCaseReportModule_generateReport_outputDirIsNotDir(outputDir.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_generateReport_outputDirIsNotDir(outputDir.toString()), null, progressPanel); // NON-NLS
return; return;
@ -243,7 +251,7 @@ public class PortableCaseReportModule implements ReportModule {
tagNames = Case.getCurrentCaseThrows().getServices().getTagsManager().getTagNamesInUse(); tagNames = Case.getCurrentCaseThrows().getServices().getTagsManager().getTagNamesInUse();
} catch (NoCurrentCaseException | TskCoreException ex) { } catch (NoCurrentCaseException | TskCoreException ex) {
handleError("Unable to get all tags", handleError("Unable to get all tags",
Bundle.PortableCaseReportModule_generateReport_errorReadingTags(), ex, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_generateReport_errorReadingTags(), ex, progressPanel); // NON-NLS
return; return;
} }
} else { } else {
@ -256,7 +264,7 @@ public class PortableCaseReportModule implements ReportModule {
setNames = getAllInterestingItemsSets(); setNames = getAllInterestingItemsSets();
} catch (NoCurrentCaseException | TskCoreException ex) { } catch (NoCurrentCaseException | TskCoreException ex) {
handleError("Unable to get all interesting items sets", handleError("Unable to get all interesting items sets",
Bundle.PortableCaseReportModule_generateReport_errorReadingSets(), ex, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_generateReport_errorReadingSets(), ex, progressPanel); // NON-NLS
return; return;
} }
} else { } else {
@ -295,7 +303,7 @@ public class PortableCaseReportModule implements ReportModule {
// Copy the selected tags // Copy the selected tags
progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateReport_copyingTags()); progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateReport_copyingTags());
try { try {
for(TagName tagName:tagNames) { for (TagName tagName : tagNames) {
TagName newTagName = portableSkCase.addOrUpdateTagName(tagName.getDisplayName(), tagName.getDescription(), tagName.getColor(), tagName.getKnownStatus()); TagName newTagName = portableSkCase.addOrUpdateTagName(tagName.getDisplayName(), tagName.getDescription(), tagName.getColor(), tagName.getKnownStatus());
oldTagNameToNewTagName.put(tagName, newTagName); oldTagNameToNewTagName.put(tagName, newTagName);
} }
@ -305,10 +313,10 @@ public class PortableCaseReportModule implements ReportModule {
} }
// Set up tracking to support any custom artifact or attribute types // Set up tracking to support any custom artifact or attribute types
for (BlackboardArtifact.ARTIFACT_TYPE type:BlackboardArtifact.ARTIFACT_TYPE.values()) { for (BlackboardArtifact.ARTIFACT_TYPE type : BlackboardArtifact.ARTIFACT_TYPE.values()) {
oldArtTypeIdToNewArtTypeId.put(type.getTypeID(), type.getTypeID()); oldArtTypeIdToNewArtTypeId.put(type.getTypeID(), type.getTypeID());
} }
for (BlackboardAttribute.ATTRIBUTE_TYPE type:BlackboardAttribute.ATTRIBUTE_TYPE.values()) { for (BlackboardAttribute.ATTRIBUTE_TYPE type : BlackboardAttribute.ATTRIBUTE_TYPE.values()) {
try { try {
oldAttrTypeIdToNewAttrType.put(type.getTypeID(), portableSkCase.getAttributeType(type.getLabel())); oldAttrTypeIdToNewAttrType.put(type.getTypeID(), portableSkCase.getAttributeType(type.getLabel()));
} catch (TskCoreException ex) { } catch (TskCoreException ex) {
@ -320,7 +328,7 @@ public class PortableCaseReportModule implements ReportModule {
// Copy the tagged files // Copy the tagged files
try { try {
for(TagName tagName:tagNames) { for (TagName tagName : tagNames) {
// Check for cancellation // Check for cancellation
if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) { if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) {
handleCancellation(progressPanel); handleCancellation(progressPanel);
@ -342,7 +350,7 @@ public class PortableCaseReportModule implements ReportModule {
// Copy the tagged artifacts and associated files // Copy the tagged artifacts and associated files
try { try {
for(TagName tagName:tagNames) { for (TagName tagName : tagNames) {
// Check for cancellation // Check for cancellation
if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) { if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) {
handleCancellation(progressPanel); handleCancellation(progressPanel);
@ -363,10 +371,10 @@ public class PortableCaseReportModule implements ReportModule {
} }
// Copy interesting files and results // Copy interesting files and results
if (! setNames.isEmpty()) { if (!setNames.isEmpty()) {
try { try {
List<BlackboardArtifact> interestingFiles = currentCase.getSleuthkitCase().getBlackboardArtifacts(BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_FILE_HIT); List<BlackboardArtifact> interestingFiles = currentCase.getSleuthkitCase().getBlackboardArtifacts(BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_FILE_HIT);
for (BlackboardArtifact art:interestingFiles) { for (BlackboardArtifact art : interestingFiles) {
// Check for cancellation // Check for cancellation
if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) { if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) {
handleCancellation(progressPanel); handleCancellation(progressPanel);
@ -385,7 +393,7 @@ public class PortableCaseReportModule implements ReportModule {
try { try {
List<BlackboardArtifact> interestingResults = currentCase.getSleuthkitCase().getBlackboardArtifacts(BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_ARTIFACT_HIT); List<BlackboardArtifact> interestingResults = currentCase.getSleuthkitCase().getBlackboardArtifacts(BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_ARTIFACT_HIT);
for (BlackboardArtifact art:interestingResults) { for (BlackboardArtifact art : interestingResults) {
// Check for cancellation // Check for cancellation
if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) { if (progressPanel.getStatus() == ReportProgressPanel.ReportStatus.CANCELED) {
handleCancellation(progressPanel); handleCancellation(progressPanel);
@ -423,7 +431,7 @@ public class PortableCaseReportModule implements ReportModule {
return; return;
} }
if (! success) { if (!success) {
// Errors have been handled already // Errors have been handled already
return; return;
} }
@ -456,17 +464,23 @@ public class PortableCaseReportModule implements ReportModule {
private void generateCaseUcoReport(List<TagName> tagNames, List<String> setNames, ReportProgressPanel progressPanel) { private void generateCaseUcoReport(List<TagName> tagNames, List<String> setNames, ReportProgressPanel progressPanel) {
//Create the 'Reports' directory to include a CASE-UCO report. //Create the 'Reports' directory to include a CASE-UCO report.
Path reportsDirectory = Paths.get(caseFolder.toString(), "Reports"); Path reportsDirectory = Paths.get(caseFolder.toString(), "Reports");
if(!reportsDirectory.toFile().mkdir()) { if (!reportsDirectory.toFile().mkdir()) {
logger.log(Level.SEVERE, "Could not make the report folder... skipping " logger.log(Level.SEVERE, "Could not make the report folder... skipping "
+ "CASE-UCO report generation for the portable case"); + "CASE-UCO report generation for the portable case");
return; return;
} }
try { Path reportFile = reportsDirectory.resolve(CASE_UCO_FILE_NAME);
//Try to generate case uco output.
progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateCaseUcoReport_startCaseUcoReportGeneration()); progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateCaseUcoReport_startCaseUcoReportGeneration());
CaseUcoReportGenerator reportGenerator = new CaseUcoReportGenerator(reportsDirectory, CASE_UCO_FILE_NAME); try (OutputStream stream = new FileOutputStream(reportFile.toFile());
//Acquire references for file discovery JsonWriter reportWriter = new JsonWriter(new OutputStreamWriter(stream, "UTF-8"))) {
Gson gson = new GsonBuilder().setPrettyPrinting().create();
reportWriter.setIndent(" ");
reportWriter.beginObject();
reportWriter.name("@graph");
reportWriter.beginArray();
String caseTempDirectory = currentCase.getTempDirectory(); String caseTempDirectory = currentCase.getTempDirectory();
SleuthkitCase skCase = currentCase.getSleuthkitCase(); SleuthkitCase skCase = currentCase.getSleuthkitCase();
TagsManager tagsManager = currentCase.getServices().getTagsManager(); TagsManager tagsManager = currentCase.getServices().getTagsManager();
@ -477,7 +491,10 @@ public class PortableCaseReportModule implements ReportModule {
FileUtils.deleteDirectory(tmpDir.toFile()); FileUtils.deleteDirectory(tmpDir.toFile());
Files.createDirectory(tmpDir); Files.createDirectory(tmpDir);
reportGenerator.addCase(currentCase); CaseUcoExporter exporter = new CaseUcoExporter(currentCase.getSleuthkitCase());
for (JsonElement element : exporter.exportSleuthkitCase()) {
gson.toJson(element, reportWriter);
}
//Load all interesting BlackboardArtifacts that belong to the selected SET_NAMEs //Load all interesting BlackboardArtifacts that belong to the selected SET_NAMEs
//binned by data source id. //binned by data source id.
@ -485,35 +502,33 @@ public class PortableCaseReportModule implements ReportModule {
//Search each data source looking for content tags and interesting //Search each data source looking for content tags and interesting
//items that match the selected tag names and set names. //items that match the selected tag names and set names.
for (Content dataSource : currentCase.getDataSources()) { for (DataSource dataSource : currentCase.getSleuthkitCase().getDataSources()) {
/** // Helper flag to ensure each data source is only written once in
* It is currently believed that DataSources in a CASE-UCO report // a report.
* should precede all file entities. Therefore, before
* writing a file, add the data source if it
* has yet to be included.
*/
boolean dataSourceHasBeenIncluded = false; boolean dataSourceHasBeenIncluded = false;
//Search content tags and artifact tags that match //Search content tags and artifact tags that match
for (TagName tagName : tagNames) { for (TagName tagName : tagNames) {
for (ContentTag ct : tagsManager.getContentTagsByTagName(tagName, dataSource.getId())) { for (ContentTag ct : tagsManager.getContentTagsByTagName(tagName, dataSource.getId())) {
dataSourceHasBeenIncluded |= addUniqueFile(ct.getContent(), dataSourceHasBeenIncluded |= addUniqueFile(ct.getContent(),
dataSource, tmpDir, reportGenerator, dataSourceHasBeenIncluded); dataSource, tmpDir, gson, exporter, reportWriter, dataSourceHasBeenIncluded);
} }
for (BlackboardArtifactTag bat : tagsManager.getBlackboardArtifactTagsByTagName(tagName, dataSource.getId())) { for (BlackboardArtifactTag bat : tagsManager.getBlackboardArtifactTagsByTagName(tagName, dataSource.getId())) {
dataSourceHasBeenIncluded |= addUniqueFile(bat.getContent(), dataSourceHasBeenIncluded |= addUniqueFile(bat.getContent(),
dataSource, tmpDir, reportGenerator, dataSourceHasBeenIncluded); dataSource, tmpDir, gson, exporter, reportWriter, dataSourceHasBeenIncluded);
} }
} }
//Search artifacts that this data source contains //Search artifacts that this data source contains
for(BlackboardArtifact bArt : artifactsWithSetName.get(dataSource.getId())) { for (BlackboardArtifact bArt : artifactsWithSetName.get(dataSource.getId())) {
Content sourceContent = bArt.getParent(); Content sourceContent = bArt.getParent();
dataSourceHasBeenIncluded |= addUniqueFile(sourceContent, dataSource, dataSourceHasBeenIncluded |= addUniqueFile(sourceContent, dataSource,
tmpDir, reportGenerator, dataSourceHasBeenIncluded); tmpDir, gson, exporter, reportWriter, dataSourceHasBeenIncluded);
} }
} }
//Create the report. // Finish the report.
reportGenerator.generateReport(); reportWriter.endArray();
reportWriter.endObject();
progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateCaseUcoReport_successCaseUcoReportGeneration()); progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateCaseUcoReport_successCaseUcoReportGeneration());
} catch (IOException | TskCoreException ex) { } catch (IOException | TskCoreException ex) {
progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateCaseUcoReport_errorGeneratingCaseUcoReport()); progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_generateCaseUcoReport_errorGeneratingCaseUcoReport());
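The streaming JSON idiom above is spread across several hunks, so here is a minimal, self-contained sketch of the same Gson pattern: open a JsonWriter over a file, emit one object containing a "@graph" array, and stream JsonElement instances into it one at a time. The file name and element contents are invented for illustration; only the Gson calls mirror the code above.

    import com.google.gson.Gson;
    import com.google.gson.GsonBuilder;
    import com.google.gson.JsonObject;
    import com.google.gson.stream.JsonWriter;
    import java.io.FileOutputStream;
    import java.io.OutputStreamWriter;
    import java.nio.charset.StandardCharsets;

    final class GraphWriterSketch {
        public static void main(String[] args) throws Exception {
            Gson gson = new GsonBuilder().setPrettyPrinting().create();
            try (JsonWriter writer = new JsonWriter(new OutputStreamWriter(
                    new FileOutputStream("example-report.json"), StandardCharsets.UTF_8))) {
                writer.setIndent("    ");
                writer.beginObject();              // {
                writer.name("@graph");             //   "@graph":
                writer.beginArray();               //   [
                for (int i = 0; i < 3; i++) {      // stand-in for the exporter's JsonElement output
                    JsonObject element = new JsonObject();
                    element.addProperty("id", i);
                    gson.toJson(element, writer);  // stream each element; the array is never buffered
                }
                writer.endArray();                 //   ]
                writer.endObject();                // }
            }
        }
    }

Writing the array incrementally keeps memory use flat even when a case produces a very large number of exported entities.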
@ -530,15 +545,15 @@ public class PortableCaseReportModule implements ReportModule {
*/ */
private Multimap<Long, BlackboardArtifact> getInterestingArtifactsBySetName(SleuthkitCase skCase, List<String> setNames) throws TskCoreException { private Multimap<Long, BlackboardArtifact> getInterestingArtifactsBySetName(SleuthkitCase skCase, List<String> setNames) throws TskCoreException {
Multimap<Long, BlackboardArtifact> artifactsWithSetName = ArrayListMultimap.create(); Multimap<Long, BlackboardArtifact> artifactsWithSetName = ArrayListMultimap.create();
if(!setNames.isEmpty()) { if (!setNames.isEmpty()) {
List<BlackboardArtifact> allArtifacts = skCase.getBlackboardArtifacts( List<BlackboardArtifact> allArtifacts = skCase.getBlackboardArtifacts(
BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_FILE_HIT); BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_FILE_HIT);
allArtifacts.addAll(skCase.getBlackboardArtifacts( allArtifacts.addAll(skCase.getBlackboardArtifacts(
BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_ARTIFACT_HIT)); BlackboardArtifact.ARTIFACT_TYPE.TSK_INTERESTING_ARTIFACT_HIT));
for(BlackboardArtifact bArt : allArtifacts) { for (BlackboardArtifact bArt : allArtifacts) {
BlackboardAttribute setAttr = bArt.getAttribute( BlackboardAttribute setAttr = bArt.getAttribute(
new BlackboardAttribute.Type(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_SET_NAME)); new BlackboardAttribute.Type(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_SET_NAME));
if (setNames.contains(setAttr.getValueString())) { if (setNames.contains(setAttr.getValueString())) {
artifactsWithSetName.put(bArt.getDataSource().getId(), bArt); artifactsWithSetName.put(bArt.getDataSource().getId(), bArt);
} }
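As a side note, the ArrayListMultimap used here is simply a map from a key to a growable list of values that is created on first insert. A small sketch of the same grouping idea, with a hypothetical Item type standing in for BlackboardArtifact:

    import com.google.common.collect.ArrayListMultimap;
    import com.google.common.collect.Multimap;
    import java.util.List;

    final class GroupBySketch {

        // Hypothetical stand-in for an artifact that knows its data source id.
        static final class Item {
            final long dataSourceId;
            final String name;
            Item(long dataSourceId, String name) {
                this.dataSourceId = dataSourceId;
                this.name = name;
            }
        }

        static Multimap<Long, Item> groupByDataSource(List<Item> items) {
            Multimap<Long, Item> grouped = ArrayListMultimap.create();
            for (Item item : items) {
                grouped.put(item.dataSourceId, item);  // appends to that id's list
            }
            return grouped;  // grouped.get(id) returns an empty collection for unknown ids
        }
    }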
@ -553,27 +568,34 @@ public class PortableCaseReportModule implements ReportModule {
* @param content Content to add to the report. * @param content Content to add to the report.
* @param dataSource Parent dataSource of the content instance. * @param dataSource Parent dataSource of the content instance.
* @param tmpDir Path to the tmpDir to enforce uniqueness * @param tmpDir Path to the tmpDir to enforce uniqueness
* @param reportGenerator Report generator instance to add the content to * @param gson
* @param exporter
* @param reportWriter Report generator instance to add the content to
* @param dataSourceHasBeenIncluded Flag determining if the data source * @param dataSourceHasBeenIncluded Flag determining if the data source
* should be written before the file. False will cause the data source to be written. * should be written to the report (false indicates that it should be written).
*
* @throws IOException If an I/O error occurs. * @throws IOException If an I/O error occurs.
* @throws TskCoreException If an internal database error occurs. * @throws TskCoreException If an internal database error occurs.
* *
* return True if the data source was written during this operation. * return True if the file was written during this operation.
*/ */
private boolean addUniqueFile(Content content, Content dataSource, private boolean addUniqueFile(Content content, DataSource dataSource,
Path tmpDir, CaseUcoReportGenerator reportGenerator, Path tmpDir, Gson gson, CaseUcoExporter exporter, JsonWriter reportWriter,
boolean dataSourceHasBeenIncluded) throws IOException, TskCoreException { boolean dataSourceHasBeenIncluded) throws IOException, TskCoreException {
if (content instanceof AbstractFile && !(content instanceof DataSource)) { if (content instanceof AbstractFile && !(content instanceof DataSource)) {
AbstractFile absFile = (AbstractFile) content; AbstractFile absFile = (AbstractFile) content;
Path filePath = tmpDir.resolve(Long.toString(absFile.getId())); Path filePath = tmpDir.resolve(Long.toString(absFile.getId()));
if (!absFile.isDir() && !Files.exists(filePath)) { if (!absFile.isDir() && !Files.exists(filePath)) {
if(!dataSourceHasBeenIncluded) { if (!dataSourceHasBeenIncluded) {
reportGenerator.addDataSource(dataSource, currentCase); for (JsonElement element : exporter.exportDataSource(dataSource)) {
gson.toJson(element, reportWriter);
}
} }
String subFolder = getExportSubfolder(absFile); String subFolder = getExportSubfolder(absFile);
String fileName = absFile.getId() + "-" + FileUtil.escapeFileName(absFile.getName()); String fileName = absFile.getId() + "-" + FileUtil.escapeFileName(absFile.getName());
reportGenerator.addFile(absFile, dataSource, Paths.get(FILE_FOLDER_NAME, subFolder, fileName)); for (JsonElement element : exporter.exportAbstractFile(absFile, Paths.get(FILE_FOLDER_NAME, subFolder, fileName).toString())) {
gson.toJson(element, reportWriter);
}
Files.createFile(filePath); Files.createFile(filePath);
return true; return true;
} }
@ -604,12 +626,11 @@ public class PortableCaseReportModule implements ReportModule {
return setNames; return setNames;
} }
/** /**
* Create the case directory and case database. * Create the case directory and case database. portableSkCase will be set
* portableSkCase will be set if this completes without error. * if this completes without error.
* *
* @param outputDir The parent for the case folder * @param outputDir The parent for the case folder
* @param progressPanel * @param progressPanel
*/ */
@NbBundle.Messages({ @NbBundle.Messages({
@ -618,8 +639,7 @@ public class PortableCaseReportModule implements ReportModule {
"PortableCaseReportModule.createCase.errorCreatingCase=Error creating case", "PortableCaseReportModule.createCase.errorCreatingCase=Error creating case",
"# {0} - folder", "# {0} - folder",
"PortableCaseReportModule.createCase.errorCreatingFolder=Error creating folder {0}", "PortableCaseReportModule.createCase.errorCreatingFolder=Error creating folder {0}",
"PortableCaseReportModule.createCase.errorStoringMaxIds=Error storing maximum database IDs", "PortableCaseReportModule.createCase.errorStoringMaxIds=Error storing maximum database IDs",})
})
private void createCase(File outputDir, ReportProgressPanel progressPanel) { private void createCase(File outputDir, ReportProgressPanel progressPanel) {
// Create the case folder // Create the case folder
@ -627,7 +647,7 @@ public class PortableCaseReportModule implements ReportModule {
if (caseFolder.exists()) { if (caseFolder.exists()) {
handleError("Case folder " + caseFolder.toString() + " already exists", handleError("Case folder " + caseFolder.toString() + " already exists",
Bundle.PortableCaseReportModule_createCase_caseDirExists(caseFolder.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_createCase_caseDirExists(caseFolder.toString()), null, progressPanel); // NON-NLS
return; return;
} }
@ -636,7 +656,7 @@ public class PortableCaseReportModule implements ReportModule {
portableSkCase = currentCase.createPortableCase(caseName, caseFolder); portableSkCase = currentCase.createPortableCase(caseName, caseFolder);
} catch (TskCoreException ex) { } catch (TskCoreException ex) {
handleError("Error creating case " + caseName + " in folder " + caseFolder.toString(), handleError("Error creating case " + caseName + " in folder " + caseFolder.toString(),
Bundle.PortableCaseReportModule_createCase_errorCreatingCase(), ex, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_createCase_errorCreatingCase(), ex, progressPanel); // NON-NLS
return; return;
} }
@ -645,31 +665,31 @@ public class PortableCaseReportModule implements ReportModule {
saveHighestIds(); saveHighestIds();
} catch (TskCoreException ex) { } catch (TskCoreException ex) {
handleError("Error storing maximum database IDs", handleError("Error storing maximum database IDs",
Bundle.PortableCaseReportModule_createCase_errorStoringMaxIds(), ex, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_createCase_errorStoringMaxIds(), ex, progressPanel); // NON-NLS
return; return;
} }
// Create the base folder for the copied files // Create the base folder for the copied files
copiedFilesFolder = Paths.get(caseFolder.toString(), FILE_FOLDER_NAME).toFile(); copiedFilesFolder = Paths.get(caseFolder.toString(), FILE_FOLDER_NAME).toFile();
if (! copiedFilesFolder.mkdir()) { if (!copiedFilesFolder.mkdir()) {
handleError("Error creating folder " + copiedFilesFolder.toString(), handleError("Error creating folder " + copiedFilesFolder.toString(),
Bundle.PortableCaseReportModule_createCase_errorCreatingFolder(copiedFilesFolder.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_createCase_errorCreatingFolder(copiedFilesFolder.toString()), null, progressPanel); // NON-NLS
return; return;
} }
// Create subfolders for the copied files // Create subfolders for the copied files
for (FileTypeCategory cat:FILE_TYPE_CATEGORIES) { for (FileTypeCategory cat : FILE_TYPE_CATEGORIES) {
File subFolder = Paths.get(copiedFilesFolder.toString(), cat.getDisplayName()).toFile(); File subFolder = Paths.get(copiedFilesFolder.toString(), cat.getDisplayName()).toFile();
if (! subFolder.mkdir()) { if (!subFolder.mkdir()) {
handleError("Error creating folder " + subFolder.toString(), handleError("Error creating folder " + subFolder.toString(),
Bundle.PortableCaseReportModule_createCase_errorCreatingFolder(subFolder.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_createCase_errorCreatingFolder(subFolder.toString()), null, progressPanel); // NON-NLS
return; return;
} }
} }
File unknownTypeFolder = Paths.get(copiedFilesFolder.toString(), UNKNOWN_FILE_TYPE_FOLDER).toFile(); File unknownTypeFolder = Paths.get(copiedFilesFolder.toString(), UNKNOWN_FILE_TYPE_FOLDER).toFile();
if (! unknownTypeFolder.mkdir()) { if (!unknownTypeFolder.mkdir()) {
handleError("Error creating folder " + unknownTypeFolder.toString(), handleError("Error creating folder " + unknownTypeFolder.toString(),
Bundle.PortableCaseReportModule_createCase_errorCreatingFolder(unknownTypeFolder.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_createCase_errorCreatingFolder(unknownTypeFolder.toString()), null, progressPanel); // NON-NLS
return; return;
} }
@ -685,7 +705,7 @@ public class PortableCaseReportModule implements ReportModule {
CaseDbAccessManager currentCaseDbManager = currentCase.getSleuthkitCase().getCaseDbAccessManager(); CaseDbAccessManager currentCaseDbManager = currentCase.getSleuthkitCase().getCaseDbAccessManager();
String tableSchema = "( table_name TEXT PRIMARY KEY, " String tableSchema = "( table_name TEXT PRIMARY KEY, "
+ " max_id TEXT)"; // NON-NLS + " max_id TEXT)"; // NON-NLS
portableSkCase.getCaseDbAccessManager().createTable(MAX_ID_TABLE_NAME, tableSchema); portableSkCase.getCaseDbAccessManager().createTable(MAX_ID_TABLE_NAME, tableSchema);
@ -706,7 +726,7 @@ public class PortableCaseReportModule implements ReportModule {
// Create the image tags table in the portable case // Create the image tags table in the portable case
CaseDbAccessManager portableDbAccessManager = portableSkCase.getCaseDbAccessManager(); CaseDbAccessManager portableDbAccessManager = portableSkCase.getCaseDbAccessManager();
if (! portableDbAccessManager.tableExists(ContentViewerTagManager.TABLE_NAME)) { if (!portableDbAccessManager.tableExists(ContentViewerTagManager.TABLE_NAME)) {
portableDbAccessManager.createTable(ContentViewerTagManager.TABLE_NAME, ContentViewerTagManager.TABLE_SCHEMA_SQLITE); portableDbAccessManager.createTable(ContentViewerTagManager.TABLE_NAME, ContentViewerTagManager.TABLE_SCHEMA_SQLITE);
} }
} }
@ -714,7 +734,7 @@ public class PortableCaseReportModule implements ReportModule {
/** /**
* Add all files with a given tag to the portable case. * Add all files with a given tag to the portable case.
* *
* @param oldTagName The TagName object from the current case * @param oldTagName The TagName object from the current case
* @param progressPanel The progress panel * @param progressPanel The progress panel
* *
* @throws TskCoreException * @throws TskCoreException
@ -738,7 +758,7 @@ public class PortableCaseReportModule implements ReportModule {
long newFileId = copyContentToPortableCase(content, progressPanel); long newFileId = copyContentToPortableCase(content, progressPanel);
// Tag the file // Tag the file
if (! oldTagNameToNewTagName.containsKey(tag.getName())) { if (!oldTagNameToNewTagName.containsKey(tag.getName())) {
throw new TskCoreException("TagName map is missing entry for ID " + tag.getName().getId() + " with display name " + tag.getName().getDisplayName()); // NON-NLS throw new TskCoreException("TagName map is missing entry for ID " + tag.getName().getId() + " with display name " + tag.getName().getDisplayName()); // NON-NLS
} }
ContentTagChange newContentTag = portableSkCase.getTaggingManager().addContentTag(newIdToContent.get(newFileId), oldTagNameToNewTagName.get(tag.getName()), tag.getComment(), tag.getBeginByteOffset(), tag.getEndByteOffset()); ContentTagChange newContentTag = portableSkCase.getTaggingManager().addContentTag(newIdToContent.get(newFileId), oldTagNameToNewTagName.get(tag.getName()), tag.getComment(), tag.getBeginByteOffset(), tag.getEndByteOffset());
@ -746,7 +766,7 @@ public class PortableCaseReportModule implements ReportModule {
// Get the image tag data associated with this tag (empty string if there is none) // Get the image tag data associated with this tag (empty string if there is none)
// and save it if present // and save it if present
String appData = getImageTagDataForContentTag(tag); String appData = getImageTagDataForContentTag(tag);
if (! appData.isEmpty()) { if (!appData.isEmpty()) {
addImageTagToPortableCase(newContentTag.getAddedTag(), appData); addImageTagToPortableCase(newContentTag.getAddedTag(), appData);
} }
} }
@ -758,7 +778,8 @@ public class PortableCaseReportModule implements ReportModule {
* *
* @param tag The ContentTag in the current case * @param tag The ContentTag in the current case
* *
* @return The app_data string for this content tag or an empty string if there was none * @return The app_data string for this content tag or an empty string if
* there was none
* *
* @throws TskCoreException * @throws TskCoreException
*/ */
@ -807,7 +828,7 @@ public class PortableCaseReportModule implements ReportModule {
* Add an image tag to the portable case. * Add an image tag to the portable case.
* *
* @param newContentTag The content tag in the portable case * @param newContentTag The content tag in the portable case
* @param appData The string to copy into app_data * @param appData The string to copy into app_data
* *
* @throws TskCoreException * @throws TskCoreException
*/ */
@ -816,11 +837,10 @@ public class PortableCaseReportModule implements ReportModule {
portableSkCase.getCaseDbAccessManager().insert(ContentViewerTagManager.TABLE_NAME, insert); portableSkCase.getCaseDbAccessManager().insert(ContentViewerTagManager.TABLE_NAME, insert);
} }
/** /**
* Add all artifacts with a given tag to the portable case. * Add all artifacts with a given tag to the portable case.
* *
* @param oldTagName The TagName object from the current case * @param oldTagName The TagName object from the current case
* @param progressPanel The progress panel * @param progressPanel The progress panel
* *
* @throws TskCoreException * @throws TskCoreException
@ -845,7 +865,7 @@ public class PortableCaseReportModule implements ReportModule {
BlackboardArtifact newArtifact = copyArtifact(newContentId, tag.getArtifact()); BlackboardArtifact newArtifact = copyArtifact(newContentId, tag.getArtifact());
// Tag the artfiact // Tag the artfiact
if (! oldTagNameToNewTagName.containsKey(tag.getName())) { if (!oldTagNameToNewTagName.containsKey(tag.getName())) {
throw new TskCoreException("TagName map is missing entry for ID " + tag.getName().getId() + " with display name " + tag.getName().getDisplayName()); // NON-NLS throw new TskCoreException("TagName map is missing entry for ID " + tag.getName().getId() + " with display name " + tag.getName().getDisplayName()); // NON-NLS
} }
portableSkCase.getTaggingManager().addArtifactTag(newArtifact, oldTagNameToNewTagName.get(tag.getName()), tag.getComment()); portableSkCase.getTaggingManager().addArtifactTag(newArtifact, oldTagNameToNewTagName.get(tag.getName()), tag.getComment());
@ -853,9 +873,11 @@ public class PortableCaseReportModule implements ReportModule {
} }
/** /**
* Copy an artifact into the new case. Will also copy any associated artifacts * Copy an artifact into the new case. Will also copy any associated
* artifacts
* *
* @param newContentId The content ID (in the portable case) of the source content * @param newContentId The content ID (in the portable case) of the source
* content
* @param artifactToCopy The artifact to copy * @param artifactToCopy The artifact to copy
* *
* @return The new artifact in the portable case * @return The new artifact in the portable case
@ -875,7 +897,7 @@ public class PortableCaseReportModule implements ReportModule {
BlackboardArtifact oldAssociatedArtifact = currentCase.getSleuthkitCase().getBlackboardArtifact(oldAssociatedAttribute.getValueLong()); BlackboardArtifact oldAssociatedArtifact = currentCase.getSleuthkitCase().getBlackboardArtifact(oldAssociatedAttribute.getValueLong());
BlackboardArtifact newAssociatedArtifact = copyArtifact(newContentId, oldAssociatedArtifact); BlackboardArtifact newAssociatedArtifact = copyArtifact(newContentId, oldAssociatedArtifact);
newAttrs.add(new BlackboardAttribute(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_ASSOCIATED_ARTIFACT, newAttrs.add(new BlackboardAttribute(BlackboardAttribute.ATTRIBUTE_TYPE.TSK_ASSOCIATED_ARTIFACT,
String.join(",", oldAssociatedAttribute.getSources()), newAssociatedArtifact.getArtifactID())); String.join(",", oldAssociatedAttribute.getSources()), newAssociatedArtifact.getArtifactID()));
} }
// Create the new artifact // Create the new artifact
@ -884,7 +906,7 @@ public class PortableCaseReportModule implements ReportModule {
List<BlackboardAttribute> oldAttrs = artifactToCopy.getAttributes(); List<BlackboardAttribute> oldAttrs = artifactToCopy.getAttributes();
// Copy over each attribute, making sure the type is in the new case. // Copy over each attribute, making sure the type is in the new case.
for (BlackboardAttribute oldAttr:oldAttrs) { for (BlackboardAttribute oldAttr : oldAttrs) {
// The associated artifact has already been handled // The associated artifact has already been handled
if (oldAttr.getAttributeType().getTypeID() == BlackboardAttribute.ATTRIBUTE_TYPE.TSK_ASSOCIATED_ARTIFACT.getTypeID()) { if (oldAttr.getAttributeType().getTypeID() == BlackboardAttribute.ATTRIBUTE_TYPE.TSK_ASSOCIATED_ARTIFACT.getTypeID()) {
@ -927,8 +949,9 @@ public class PortableCaseReportModule implements ReportModule {
} }
/** /**
* Get the artifact type ID in the portable case and create new artifact type if needed. * Get the artifact type ID in the portable case and create new artifact
* For built-in artifacts this will be the same as the original. * type if needed. For built-in artifacts this will be the same as the
* original.
* *
* @param oldArtifact The artifact in the current case * @param oldArtifact The artifact in the current case
* *
@ -950,8 +973,9 @@ public class PortableCaseReportModule implements ReportModule {
} }
/** /**
* Get the attribute type ID in the portable case and create new attribute type if needed. * Get the attribute type ID in the portable case and create new attribute
* For built-in attributes this will be the same as the original. * type if needed. For built-in attributes this will be the same as the
* original.
* *
* @param oldAttribute The attribute in the current case * @param oldAttribute The attribute in the current case
* *
@ -976,7 +1000,7 @@ public class PortableCaseReportModule implements ReportModule {
/** /**
* Top level method to copy a content object to the portable case. * Top level method to copy a content object to the portable case.
* *
* @param content The content object to copy * @param content The content object to copy
* @param progressPanel The progress panel * @param progressPanel The progress panel
* *
* @return The object ID of the copied content in the portable case * @return The object ID of the copied content in the portable case
@ -985,8 +1009,7 @@ public class PortableCaseReportModule implements ReportModule {
*/ */
@NbBundle.Messages({ @NbBundle.Messages({
"# {0} - File name", "# {0} - File name",
"PortableCaseReportModule.copyContentToPortableCase.copyingFile=Copying file {0}", "PortableCaseReportModule.copyContentToPortableCase.copyingFile=Copying file {0}",})
})
private long copyContentToPortableCase(Content content, ReportProgressPanel progressPanel) throws TskCoreException { private long copyContentToPortableCase(Content content, ReportProgressPanel progressPanel) throws TskCoreException {
progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_copyContentToPortableCase_copyingFile(content.getUniquePath())); progressPanel.updateStatusLabel(Bundle.PortableCaseReportModule_copyContentToPortableCase_copyingFile(content.getUniquePath()));
return copyContent(content); return copyContent(content);
@ -1018,38 +1041,38 @@ public class PortableCaseReportModule implements ReportModule {
Content newContent; Content newContent;
if (content instanceof BlackboardArtifact) { if (content instanceof BlackboardArtifact) {
BlackboardArtifact artifactToCopy = (BlackboardArtifact)content; BlackboardArtifact artifactToCopy = (BlackboardArtifact) content;
newContent = copyArtifact(parentId, artifactToCopy); newContent = copyArtifact(parentId, artifactToCopy);
} else { } else {
CaseDbTransaction trans = portableSkCase.beginTransaction(); CaseDbTransaction trans = portableSkCase.beginTransaction();
try { try {
if (content instanceof Image) { if (content instanceof Image) {
Image image = (Image)content; Image image = (Image) content;
newContent = portableSkCase.addImage(image.getType(), image.getSsize(), image.getSize(), image.getName(), newContent = portableSkCase.addImage(image.getType(), image.getSsize(), image.getSize(), image.getName(),
new ArrayList<>(), image.getTimeZone(), image.getMd5(), image.getSha1(), image.getSha256(), image.getDeviceId(), trans); new ArrayList<>(), image.getTimeZone(), image.getMd5(), image.getSha1(), image.getSha256(), image.getDeviceId(), trans);
} else if (content instanceof VolumeSystem) { } else if (content instanceof VolumeSystem) {
VolumeSystem vs = (VolumeSystem)content; VolumeSystem vs = (VolumeSystem) content;
newContent = portableSkCase.addVolumeSystem(parentId, vs.getType(), vs.getOffset(), vs.getBlockSize(), trans); newContent = portableSkCase.addVolumeSystem(parentId, vs.getType(), vs.getOffset(), vs.getBlockSize(), trans);
} else if (content instanceof Volume) { } else if (content instanceof Volume) {
Volume vs = (Volume)content; Volume vs = (Volume) content;
newContent = portableSkCase.addVolume(parentId, vs.getAddr(), vs.getStart(), vs.getLength(), newContent = portableSkCase.addVolume(parentId, vs.getAddr(), vs.getStart(), vs.getLength(),
vs.getDescription(), vs.getFlags(), trans); vs.getDescription(), vs.getFlags(), trans);
} else if (content instanceof Pool) { } else if (content instanceof Pool) {
Pool pool = (Pool)content; Pool pool = (Pool) content;
newContent = portableSkCase.addPool(parentId, pool.getType(), trans); newContent = portableSkCase.addPool(parentId, pool.getType(), trans);
} else if (content instanceof FileSystem) { } else if (content instanceof FileSystem) {
FileSystem fs = (FileSystem)content; FileSystem fs = (FileSystem) content;
newContent = portableSkCase.addFileSystem(parentId, fs.getImageOffset(), fs.getFsType(), fs.getBlock_size(), newContent = portableSkCase.addFileSystem(parentId, fs.getImageOffset(), fs.getFsType(), fs.getBlock_size(),
fs.getBlock_count(), fs.getRoot_inum(), fs.getFirst_inum(), fs.getLastInum(), fs.getBlock_count(), fs.getRoot_inum(), fs.getFirst_inum(), fs.getLastInum(),
fs.getName(), trans); fs.getName(), trans);
} else if (content instanceof BlackboardArtifact) { } else if (content instanceof BlackboardArtifact) {
BlackboardArtifact artifactToCopy = (BlackboardArtifact)content; BlackboardArtifact artifactToCopy = (BlackboardArtifact) content;
newContent = copyArtifact(parentId, artifactToCopy); newContent = copyArtifact(parentId, artifactToCopy);
} else if (content instanceof AbstractFile) { } else if (content instanceof AbstractFile) {
AbstractFile abstractFile = (AbstractFile)content; AbstractFile abstractFile = (AbstractFile) content;
if (abstractFile instanceof LocalFilesDataSource) { if (abstractFile instanceof LocalFilesDataSource) {
LocalFilesDataSource localFilesDS = (LocalFilesDataSource)abstractFile; LocalFilesDataSource localFilesDS = (LocalFilesDataSource) abstractFile;
newContent = portableSkCase.addLocalFilesDataSource(localFilesDS.getDeviceId(), localFilesDS.getName(), localFilesDS.getTimeZone(), trans); newContent = portableSkCase.addLocalFilesDataSource(localFilesDS.getDeviceId(), localFilesDS.getName(), localFilesDS.getTimeZone(), trans);
} else { } else {
if (abstractFile.isDir()) { if (abstractFile.isDir()) {
@ -1065,13 +1088,13 @@ public class PortableCaseReportModule implements ReportModule {
// Get the new parent object in the portable case database // Get the new parent object in the portable case database
Content oldParent = abstractFile.getParent(); Content oldParent = abstractFile.getParent();
if (! oldIdToNewContent.containsKey(oldParent.getId())) { if (!oldIdToNewContent.containsKey(oldParent.getId())) {
throw new TskCoreException("Parent of file with ID " + abstractFile.getId() + " has not been created"); // NON-NLS throw new TskCoreException("Parent of file with ID " + abstractFile.getId() + " has not been created"); // NON-NLS
} }
Content newParent = oldIdToNewContent.get(oldParent.getId()); Content newParent = oldIdToNewContent.get(oldParent.getId());
// Construct the relative path to the copied file // Construct the relative path to the copied file
String relativePath = FILE_FOLDER_NAME + File.separator + exportSubFolder + File.separator + fileName; String relativePath = FILE_FOLDER_NAME + File.separator + exportSubFolder + File.separator + fileName;
newContent = portableSkCase.addLocalFile(abstractFile.getName(), relativePath, abstractFile.getSize(), newContent = portableSkCase.addLocalFile(abstractFile.getName(), relativePath, abstractFile.getSize(),
abstractFile.getCtime(), abstractFile.getCrtime(), abstractFile.getAtime(), abstractFile.getMtime(), abstractFile.getCtime(), abstractFile.getCrtime(), abstractFile.getAtime(), abstractFile.getMtime(),
@ -1088,9 +1111,9 @@ public class PortableCaseReportModule implements ReportModule {
throw new TskCoreException("Trying to copy unexpected Content type " + content.getClass().getName()); // NON-NLS throw new TskCoreException("Trying to copy unexpected Content type " + content.getClass().getName()); // NON-NLS
} }
trans.commit(); trans.commit();
} catch (TskCoreException ex) { } catch (TskCoreException ex) {
trans.rollback(); trans.rollback();
throw(ex); throw (ex);
} }
} }
@ -1112,7 +1135,7 @@ public class PortableCaseReportModule implements ReportModule {
return UNKNOWN_FILE_TYPE_FOLDER; return UNKNOWN_FILE_TYPE_FOLDER;
} }
for (FileTypeCategory cat:FILE_TYPE_CATEGORIES) { for (FileTypeCategory cat : FILE_TYPE_CATEGORIES) {
if (cat.getMediaTypes().contains(abstractFile.getMIMEType())) { if (cat.getMediaTypes().contains(abstractFile.getMIMEType())) {
return cat.getDisplayName(); return cat.getDisplayName();
} }
@ -1153,7 +1176,6 @@ public class PortableCaseReportModule implements ReportModule {
configPanel = new CreatePortableCasePanel(); configPanel = new CreatePortableCasePanel();
return configPanel; return configPanel;
} */ } */
private class StoreMaxIdCallback implements CaseDbAccessManager.CaseDbAccessQueryCallback { private class StoreMaxIdCallback implements CaseDbAccessManager.CaseDbAccessQueryCallback {
private final String tableName; private final String tableName;
@ -1190,8 +1212,7 @@ public class PortableCaseReportModule implements ReportModule {
"# {0} - Temp folder path", "# {0} - Temp folder path",
"PortableCaseReportModule.compressCase.errorCreatingTempFolder=Could not create temporary folder {0}", "PortableCaseReportModule.compressCase.errorCreatingTempFolder=Could not create temporary folder {0}",
"PortableCaseReportModule.compressCase.errorCompressingCase=Error compressing case", "PortableCaseReportModule.compressCase.errorCompressingCase=Error compressing case",
"PortableCaseReportModule.compressCase.canceled=Compression canceled by user", "PortableCaseReportModule.compressCase.canceled=Compression canceled by user",})
})
private boolean compressCase(ReportProgressPanel progressPanel) { private boolean compressCase(ReportProgressPanel progressPanel) {
// Close the portable case database (we still need some of the variables that would be cleared by cleanup()) // Close the portable case database (we still need some of the variables that would be cleared by cleanup())
@ -1199,7 +1220,7 @@ public class PortableCaseReportModule implements ReportModule {
// Make a temporary folder for the compressed case // Make a temporary folder for the compressed case
File tempZipFolder = Paths.get(currentCase.getTempDirectory(), "portableCase" + System.currentTimeMillis()).toFile(); // NON-NLS File tempZipFolder = Paths.get(currentCase.getTempDirectory(), "portableCase" + System.currentTimeMillis()).toFile(); // NON-NLS
if (! tempZipFolder.mkdir()) { if (!tempZipFolder.mkdir()) {
handleError("Error creating temporary folder " + tempZipFolder.toString(), handleError("Error creating temporary folder " + tempZipFolder.toString(),
Bundle.PortableCaseReportModule_compressCase_errorCreatingTempFolder(tempZipFolder.toString()), null, progressPanel); // NON-NLS Bundle.PortableCaseReportModule_compressCase_errorCreatingTempFolder(tempZipFolder.toString()), null, progressPanel); // NON-NLS
return false; return false;
@ -1222,7 +1243,7 @@ public class PortableCaseReportModule implements ReportModule {
ProcessBuilder procBuilder = new ProcessBuilder(); ProcessBuilder procBuilder = new ProcessBuilder();
procBuilder.command( procBuilder.command(
sevenZipExe.getAbsolutePath(), sevenZipExe.getAbsolutePath(),
"a", // Add to archive "a", // Add to archive
zipFile.getAbsolutePath(), zipFile.getAbsolutePath(),
caseFolder.getAbsolutePath(), caseFolder.getAbsolutePath(),
chunkOption chunkOption
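For context, the compression step above simply shells out to 7-Zip. A stripped-down sketch of launching such a process and waiting for it to finish is shown below; the paths are hypothetical, and the volume/chunk option passed by the real code is elided.

    import java.io.File;

    final class SevenZipSketch {
        static int compress(File sevenZipExe, File zipFile, File folderToCompress) throws Exception {
            ProcessBuilder procBuilder = new ProcessBuilder();
            procBuilder.command(
                    sevenZipExe.getAbsolutePath(),
                    "a",                                  // add to archive
                    zipFile.getAbsolutePath(),
                    folderToCompress.getAbsolutePath());
            procBuilder.redirectErrorStream(true);        // merge stderr into stdout
            procBuilder.redirectOutput(ProcessBuilder.Redirect.INHERIT);
            Process process = procBuilder.start();
            return process.waitFor();                     // 0 indicates success
        }
    }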
@ -40,7 +40,7 @@ public class TimeLineModule {
private static final Logger logger = Logger.getLogger(TimeLineModule.class.getName()); private static final Logger logger = Logger.getLogger(TimeLineModule.class.getName());
private static final Object controllerLock = new Object(); private static final Object controllerLock = new Object();
private static TimeLineController controller; private static volatile TimeLineController controller;
/** /**
* provides static utilities, can not be instantiated * provides static utilities, can not be instantiated
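The change above marks the shared controller field volatile. As a generic illustration (not the actual TimeLineController code), volatile guarantees that a reference published inside a synchronized block is visible to threads that later read the field without taking the lock, which is what makes the double-checked pattern below safe:

    final class LazySingletonSketch {

        private static final Object lock = new Object();
        private static volatile LazySingletonSketch instance;  // volatile: safe unsynchronized reads

        static LazySingletonSketch get() {
            LazySingletonSketch local = instance;               // first check, no lock
            if (local == null) {
                synchronized (lock) {
                    local = instance;                           // second check under the lock
                    if (local == null) {
                        local = new LazySingletonSketch();
                        instance = local;                       // publish the fully constructed object
                    }
                }
            }
            return local;
        }
    }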
@ -28,8 +28,10 @@ import junit.framework.Test;
import org.apache.commons.io.FileUtils; import org.apache.commons.io.FileUtils;
import org.netbeans.junit.NbModuleSuite; import org.netbeans.junit.NbModuleSuite;
import org.openide.util.Exceptions;
import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount.CentralRepoAccountType; import org.sleuthkit.autopsy.centralrepository.datamodel.CentralRepoAccount.CentralRepoAccountType;
import org.sleuthkit.datamodel.Account; import org.sleuthkit.datamodel.Account;
import org.sleuthkit.datamodel.InvalidAccountIDException;
/** /**
* Tests the Account APIs on the Central Repository. * Tests the Account APIs on the Central Repository.
@ -145,7 +147,7 @@ public class CentralRepoAccountsTest extends TestCase {
// Create the account // Create the account
CentralRepository.getInstance() CentralRepository.getInstance()
.getOrCreateAccount(expectedAccountType, "+1 401-231-2552"); .getOrCreateAccount(expectedAccountType, "+1 401-231-2552");
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("Didn't expect an exception here. Exception: " + ex); Assert.fail("Didn't expect an exception here. Exception: " + ex);
} }
} }
@ -167,7 +169,7 @@ public class CentralRepoAccountsTest extends TestCase {
Assert.assertEquals(expectedAccountType, actualAccount.getAccountType()); Assert.assertEquals(expectedAccountType, actualAccount.getAccountType());
Assert.assertEquals("+1 441-231-2552", actualAccount.getIdentifier()); Assert.assertEquals("+1 441-231-2552", actualAccount.getIdentifier());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("Didn't expect an exception here. Exception: " + ex); Assert.fail("Didn't expect an exception here. Exception: " + ex);
} }
} }
@ -33,6 +33,7 @@ import org.apache.commons.io.FileUtils;
import org.netbeans.junit.NbModuleSuite; import org.netbeans.junit.NbModuleSuite;
import org.openide.util.Exceptions; import org.openide.util.Exceptions;
import org.sleuthkit.datamodel.Account; import org.sleuthkit.datamodel.Account;
import org.sleuthkit.datamodel.InvalidAccountIDException;
import org.sleuthkit.datamodel.TskData; import org.sleuthkit.datamodel.TskData;
@ -74,7 +75,7 @@ public class CentralRepoPersonasTest extends TestCase {
private static final String FACEBOOK_ID_CATDOG = "BalooSherkhan"; private static final String FACEBOOK_ID_CATDOG = "BalooSherkhan";
private static final String DOG_EMAIL_ID = "superpupper@junglebook.com"; private static final String DOG_EMAIL_ID = "superpupper@junglebook.com";
private static final String CAT_WHATSAPP_ID = "111 222 3333"; private static final String CAT_WHATSAPP_ID = "1112223333@s.whatsapp.net";
private static final String EMAIL_ID_1 = "rkipling@jungle.book"; private static final String EMAIL_ID_1 = "rkipling@jungle.book";
private static final String HOLMES_SKYPE_ID = "live:holmes@221baker.com"; private static final String HOLMES_SKYPE_ID = "live:holmes@221baker.com";
@ -383,7 +384,7 @@ public class CentralRepoPersonasTest extends TestCase {
// Confirm the account was removed // Confirm the account was removed
Assert.assertTrue(catPersona.getPersonaAccounts().isEmpty()); Assert.assertTrue(catPersona.getPersonaAccounts().isEmpty());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("Didn't expect an exception here. Exception: " + ex); Assert.fail("Didn't expect an exception here. Exception: " + ex);
} }
} }
@ -518,7 +519,7 @@ public class CentralRepoPersonasTest extends TestCase {
Assert.assertEquals(0, holmesMetadataList.size()); Assert.assertEquals(0, holmesMetadataList.size());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("Didn't expect an exception here. Exception: " + ex); Assert.fail("Didn't expect an exception here. Exception: " + ex);
} }
} }
@ -795,7 +796,7 @@ public class CentralRepoPersonasTest extends TestCase {
} }
catch (CentralRepoException | CorrelationAttributeNormalizationException ex) { catch (CentralRepoException | CorrelationAttributeNormalizationException | InvalidAccountIDException ex) {
Exceptions.printStackTrace(ex); Exceptions.printStackTrace(ex);
Assert.fail(ex.getMessage()); Assert.fail(ex.getMessage());
} }
@ -820,7 +821,7 @@ public class CentralRepoPersonasTest extends TestCase {
// Verify Persona has a default name // Verify Persona has a default name
Assert.assertEquals(Persona.getDefaultName(), persona.getName()); Assert.assertEquals(Persona.getDefaultName(), persona.getName());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("No name persona test failed. Exception: " + ex); Assert.fail("No name persona test failed. Exception: " + ex);
} }
} }
@ -893,7 +894,7 @@ public class CentralRepoPersonasTest extends TestCase {
Assert.assertEquals(4, personaSearchResult.size()); Assert.assertEquals(4, personaSearchResult.size());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("No name persona test failed. Exception: " + ex); Assert.fail("No name persona test failed. Exception: " + ex);
} }
} }
@ -1004,7 +1005,7 @@ public class CentralRepoPersonasTest extends TestCase {
Assert.assertEquals(6, personaSearchResult.size()); Assert.assertEquals(6, personaSearchResult.size());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("No name persona test failed. Exception: " + ex); Assert.fail("No name persona test failed. Exception: " + ex);
} }
} }
@ -1077,7 +1078,7 @@ public class CentralRepoPersonasTest extends TestCase {
Assert.assertEquals(0, accountsWithUnknownIdentifier.size()); Assert.assertEquals(0, accountsWithUnknownIdentifier.size());
} catch (CentralRepoException ex) { } catch (InvalidAccountIDException | CentralRepoException ex) {
Assert.fail("No name persona test failed. Exception: " + ex); Assert.fail("No name persona test failed. Exception: " + ex);
} }
} }
@ -1,3 +1,59 @@
---------------- VERSION 4.16.0 --------------
Ingest:
- Added streaming ingest capability for disk images that allows files to be analyzed as soon as they are added to the database.
- Changed backend code so that disk image-based files are added by Java code instead of C/C++ code.
Ingest Modules:
- Added Interesting File Set rules for cloud storage, encryption, cryptocurrency, and privacy programs.
- Updated PhotoRec to 7.1 and included a 64-bit version.
- Updated RegRipper in Recent Activity to 2.8.
- Artifacts are now created for Prefetch, Background Activity Monitor, and System Resource Usage.
- Added support for MBOX files larger than 2 GB.
- Document metadata is saved as explicit artifacts and added to the timeline.
- New “no change” hash set type that does not change the status of a file.
Central Repository / Personas:
- Accounts in the Central Repository can be grouped together and associated with a digital persona.
- All accounts are now stored in the Central Repository to support correlation and persona creation.
Content viewers:
- Created artifact-specific viewers in the Results viewer for contact book and call log.
- Moved the Message viewer to a Results sub-viewer and expanded it to show accounts.
- Added Application sub-viewer for PDF files based on IcePDF.
- Annotation viewer now includes comments from hash set hits.
Geolocation Viewer:
- Different data types are now displayed using different colors.
- Track points in a track are now displayed as small, connected circles instead of full pins.
- The filter panel shows only data sources that have geolocation data.
- Geolocation artifact points can be tagged and commented upon.
File Discovery:
- Changed the UI to have more of a search flow; the content viewer is hidden until an item is selected.
Reports:
- Can be generated for a single data source instead of the entire case.
- CASE / UCO report module now includes artifacts in addition to files.
- Added backend concept of Tag Sets to support Project Vic categories from different countries.
Performance:
- Added throttling of UI refreshes to ensure data is displayed quickly and the tree does not get backed up with requests.
- Improved efficiency of adding a data source with many orphan files.
- Improved efficiency of loading file systems.
- Jython interpreter is preloaded at application startup.
Misc bug fixes and improvements:
- Fixed bug from last release where hex content viewer text was no longer fixed width.
- Altered locking to allow multiple data sources to be added at once more smoothly and to support batch inserts of file data.
- Central repository comments will no longer store tag descriptions.
- Account type nodes in the Accounts tree show counts.
- Full time stamps displayed for messages in ingest inbox.
- More detailed status during file exports.
- Improved efficiency of adding timeline events.
- Fixed bug with CVT most recent filter.
- Improved documentation and support for running on Linux/macOS.
---------------- VERSION 4.15.0 -------------- ---------------- VERSION 4.15.0 --------------
New UI Features: New UI Features:
- Added Document view to File Discovery. - Added Document view to File Discovery.
@ -380,7 +380,7 @@ final class ExtractPrefetch extends Extract {
for (AbstractFile pFile : files) { for (AbstractFile pFile : files) {
if (pFile.getParentPath().toLowerCase().contains(filePath.toLowerCase())) { if (pFile.getParentPath().toLowerCase().endsWith(filePath.toLowerCase() + '/')) {
return pFile; return pFile;
} }
} }
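The fix above tightens the parent-path match: contains() also accepts paths that merely embed the target folder name somewhere, while endsWith() with a trailing separator requires the path to terminate in exactly that folder. A tiny illustration with made-up paths:

    final class ParentPathMatchSketch {
        public static void main(String[] args) {
            String parentPath = "/windows/prefetch.backup/";  // hypothetical parent path of a file
            String filePath = "/windows/prefetch";            // folder actually being searched for

            // Old check: prints true even though the file is not under /windows/prefetch/
            System.out.println(parentPath.toLowerCase().contains(filePath.toLowerCase()));

            // New check: prints false because the path does not end in "/windows/prefetch/"
            System.out.println(parentPath.toLowerCase().endsWith(filePath.toLowerCase() + '/'));
        }
    }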
@ -1,3 +1,3 @@
<project name="TSK_VERSION"> <project name="TSK_VERSION">
<property name="TSK_VERSION" value="4.9.0"/> <property name="TSK_VERSION" value="4.10.0"/>
</project> </project>
@@ -23,6 +23,7 @@ environment:
   PYTHON: "C:\\Python36-x64"
 install:
+  - ps: choco install nuget.commandline
   - ps: choco install ant --ignore-dependencies
   - git clone https://github.com/sleuthkit/sleuthkit
   - ps: $env:Path="C:\Program Files\Java\jdk1.8.0\bin;$($env:Path);C:\ProgramData\chocolatey\lib\ant"
@@ -36,6 +37,7 @@ services:
 build_script:
   - cd %TSK_HOME%
+  - nuget restore win32\libtsk -PackagesDirectory win32\packages
   - python setupDevRepos.py
   - python win32\updateAndBuildAll.py -m
   - ps: pushd bindings/java

View File

@@ -1,5 +1,5 @@
 #Updated by build script
-#Fri, 19 Jun 2020 10:14:47 -0400
+#Wed, 08 Jul 2020 15:15:46 -0400
 LBL_splash_window_title=Starting Autopsy
 SPLASH_HEIGHT=314
 SPLASH_WIDTH=538
@@ -8,4 +8,4 @@ SplashRunningTextBounds=0,289,538,18
 SplashRunningTextColor=0x0
 SplashRunningTextFontSize=19
-currentVersion=Autopsy 4.15.0
+currentVersion=Autopsy 4.16.0

View File

@@ -1,4 +1,4 @@
 #Updated by build script
-#Fri, 19 Jun 2020 10:14:47 -0400
+#Wed, 08 Jul 2020 15:15:46 -0400
-CTL_MainWindow_Title=Autopsy 4.15.0
+CTL_MainWindow_Title=Autopsy 4.16.0
-CTL_MainWindow_Title_No_Project=Autopsy 4.15.0
+CTL_MainWindow_Title_No_Project=Autopsy 4.16.0

View File

@@ -218,12 +218,6 @@ the Case -> Case Properties menu.
 This shows how common the selected file is. The value is the percentage of case/data source tuples that have the selected property.
-\subsection central_repo_comment Add/Edit Comment
-If you want instead to edit the comment of a node, it can be done by right clicking on the original item in the result viewer and selecting "Add/Edit Central Repository Comment".
-\image html central_repo_comment_menu.png
 \subsection cr_interesting_items Interesting Items
 In the Results tree of an open case is an entry called Interesting Items. When this module is enabled, all of the enabled

View File

@@ -3,14 +3,15 @@
 What Does It Do
 ========
-The Hash Lookup Module calculates MD5 hash values for files and looks up hash values in a database to determine if the file is notable, known (in general), or unknown.
+The Hash Lookup Module calculates MD5 hash values for files and looks up hash values in a database to determine if the file is notable, known (in general), included in a specific set of files, or unknown.
 Configuration
 =======
-The Hash Sets tab on the Options panel is where you can set and update your hash set information. Hash sets are used to identify files that are 'known' or 'notable'.
+The Hash Sets tab on the Options panel is where you can set and update your hash set information. Hash sets are used to identify files that are 'known', 'notable', or 'no change'.
 \li Known good files are those that can be safely ignored. This set of files frequently includes standard OS and application files. Ignoring such uninteresting-to-the-investigator files, can greatly reduce image analysis time.
 \li Notable (or known bad) files are those that should raise awareness. This set will vary depending on the type of investigation, but common examples include contraband images and malware.
+\li No change files are files that can reveal information about the system but are not notable. For example, knowing an image contains many files known to be maps of London could be interesting to an investigator, but the maps themselves are not notable.
 \section adding_hashsets Importing Hash Sets
@@ -35,7 +36,7 @@ To import an existing hash set, use the "Import Database" button on the Hash Set
 <b>Source Organization</b> - The organization can only be entered when importing the hash set into the central repository. See the section on \ref cr_manage_orgs "managing organizations" for more information.
-<b>Type of database</b> - All entries in the hash set should either be "known" (can be safely ignored) or "notable" (could be indicators of suspicious behavior).
+<b>Type of database</b> - All entries in the hash set should either be "known" (can be safely ignored), "notable" (could be indicators of suspicious behavior), or "no change" (known to be a certain type of file).
 <b>Make database read-only</b> - The read-only setting is only active when importing the hash set into the central repository. A read-only database can not have new hashes added to it through either the Hash Sets options panel or the context menu. For locally imported hash sets, whether they can be written to is dependent on the type of hash set. Autopsy format databases (*.kdb) can be edited, but all other types will be read-only.
@@ -52,7 +53,7 @@ After importing the hash set, you may have to index it before it can be used. Fo
 Autopsy uses the hash set management system from The Sleuth Kit. You can manually create an index using the 'hfind' command line tool or you can use Autopsy. If you attempt to proceed without indexing a hash set, Autopsy will offer to automatically produce an index for you.
 You can also specify only the index file and not use the full hash set - the index file is sufficient to identify known files. This can save space. To do this, specify the .idx file from the Hash Sets option panel.
-\section creating_hashsets Creating Hash sets
+\section creating_hashsets Creating Hash Sets
 New hash sets can be created using the "New Hash Set" button. The fields are mostly the same as the \ref adding_hashsets "import dialog" described above.
@@ -60,6 +61,15 @@ New hash sets can be created using the "New Hash Set" button. The fields are mos
 In this case, the Database Path is where the new database will be stored. If the central repository is being used then this field is not needed.
+\section hash_adding_hashes Adding Hashes to a Hash Set
+Once you've created a hash set you'll need to add hashes to it. The first way to do this is using the "Add Hashes to Hash Set" button on the options panel. Each hash should be on its own line, and can optionally be followed by a comma and then a comment about the file that hash corresponds to. Here we are creating a "no change" hash set corresponding to cat images:
+\image html hash_add.png
+The other way to add an entry to a hash set is through the context menu. Highlight the file you want to add to a hash set in the result viewer and right-click, then select "Add file to hash set" and finally the set you want to add it to. Note that this does not automatically add the file to the list of hash set hits for the current case - you will have to re-run the Hash Lookup ingest module to see it appear there.
+\image html hash_add_context.png
 \section using_hashsets Using Hash Sets
 There is an \ref ingest_page "ingest module" that will hash the files and look them up in the hash sets. It will flag files that were in the notable hash set and those results will be shown in the Results tree of the \ref tree_viewer_page.
@@ -90,8 +100,10 @@ When hash sets are configured, the user can select the hash sets to use during t
 Seeing Results
 ------
-Results show up in the tree as "Hashset Hits", grouped by the name of the hash set.
+Results show up in the tree as "Hashset Hits", grouped by the name of the hash set. If the hash set hits had associated comments, you will see them in the "Comment" column in the result viewer along with the file hash.
 \image html hashset-hits.PNG
+You can also view the comments on the "Annotation" tab of the content viewer.
 */

Binary files (documentation images) added or updated; not shown.

View File

@@ -71,6 +71,7 @@ The following topics are available here:
 - \subpage communications_page
 - \subpage geolocation_page
 - \subpage discovery_page
+- \subpage personas_page
 - Reporting
 - \subpage tagging_page

View File

@@ -0,0 +1,107 @@
/*! \page personas_page Personas
\section personas_overview Overview
Autopsy can store and organize account information based on personas, which represent an online identity. A person may have several online identities and therefore several personas. As an example, a single person may have a set of accounts that post online about loving cats and another, seemingly unrelated, set of accounts that post about hating cats.
\section personas_concepts Concepts
Here are some basic concepts about personas:
<ul>
<li>To make a persona, you need to have a name and at least one account.
<li>A persona may have multiple accounts - a phone number, an email, a Facebook account, a Skype account, etc.
<li>Personas span cases and therefore are stored in the \ref central_repo_page "Central Repository".
<li>You can manually create a persona or make one based on a contact book entry, or a call log, or a message.
<li>An account may be part of multiple personas. This can happen if someone uses the same cell phone number as a recovery number for accounts belonging to multiple personas.
<li>A persona may have additional metadata associated with it as name/value pairs.
<li>A persona may have one or more aliases.
<li>Autopsy will show you if an account is part of a persona, where applicable.
</ul>
Personas are stored in the Central Repository based on accounts that were found in results. These results are generated by various ingest modules such as the \ref recent_activity_page and \ref android_analyzer_page.
Autopsy provides a dedicated tool, \ref personas_editor "Personas Editor", to create, view, edit, and delete personas.
\section personas_editor Personas Editor
The Personas Editor is loaded through the Tools -> Personas menu item.
The left panel in the Personas Editor is a table that lists personas, based on the selected criteria. The right panel displays the details of the selected persona.
By default, when the Personas Editor is launched, all the personas in the Central Repository are listed in the table. You may filter this list by checking the "Filter personas by Keyword" checkbox. Type in either a persona name or an account identifier in the textbox and select the "Name" or "Account" radio button appropriately. Then click the "Show" button to show only the personas that match the filtering criteria.
\image html Personas/personas_main.png
\subsection personas_create Create Personas
To create a new persona, click the "New Persona" button. A "Create Persona" dialog box will pop up. The following is a description of each field:
<ul>
<li><b>Created by</b>: Will be automatically filled in with the current user
<li><b>Created on</b>: Will be automatically filled in after saving the persona
<li><b>Comment</b>: A description of the persona
<li><b>Name</b>: The name of the persona
<li><b>Accounts</b>: At least one account belonging to the persona
<li><b>Metadata</b>: (Optional) Name/value pairs of data related to the persona
<li><b>Aliases</b>: (Optional) Any aliases for this persona
<li><b>Cases found in</b>: Will be automatically filled in when editing a persona
</ul>
Each persona needs at least one account associated with it. These accounts must have been previously saved to the \ref central_repo_page "central repository". Clicking "Add" under "Accounts" will bring up another dialog with four fields, all required:
<ul>
<li><b>Identifier</b>: The identifier for the account (phone number, email address, Facebook account id, etc)
<li><b>Type</b>: The type of identifier
<li><b>Confidence</b>: General confidence that this account goes with the given persona
<li><b>Justification</b>: Why this account is being added to the persona
</ul>
\image html Personas/personas_create.png
When finished adding at least one account and filling in the required fields, click on OK to create the persona. A persona with the specified name will be created and associated with the specified account(s).
\subsection personas_edit Edit Personas
To edit a persona, click the "Edit Persona" button. You'll be able to edit all the data about the persona.
\image html Personas/persona_edit.png
\subsection personas_delete Delete Personas
To delete a persona, select the persona in the table and click on the "Delete Persona" button. Click "Yes" on confirmation dialog to delete the selected persona.
\subsection personas_account_create Create Account
All personas must be associated with at least one account. Normally these accounts will be added to the central repository by various ingest modules, but you can also create them manually with the "Create Account" button.
\image html Personas/personas_create_account.png
\section personas_artifact_viewers Persona integration in Content Viewers
Autopsy shows the personas associated with accounts, where applicable. When viewing contact, call log, and message results, Autopsy shows the personas associated with the accounts in those results. If no persona exists for an account, Autopsy provides a button for the user to create one.
As shown below, when viewing a contact result you may see persona data. When one or more personas are associated with the accounts in the result, the persona name is shown in the contact content viewer, along with a "View" button to see the details of the persona.
\image html Personas/personas_contact_found.png
If no matching persona is found, a "Create" button is shown to create a persona for the account(s). This will bring you to the \ref personas_create "Create Personas" panel with the account(s) already added.
\image html Personas/personas_contact_not_found.png
Personas are integrated similarly in the content viewers for call logs and messages/e-mail.
\image html Personas/personas_calllog.png
<br>
\image html Personas/personas_message.png
\section personas_cvt Persona Integration in the Communications Visualization Tool
Personas are integrated with the \ref communications_page. When viewing accounts in the Accounts browsers in the Communications Visualization Tool, associated persona information is shown in the tooltip if you hover over the account.
\image html Personas/personas_cvt_hover.png
As in the Autopsy main window, you may also create or view personas when examining contacts, call logs, and messages in the Communications Visualization Tool.
\image html Personas/personas_cvt_accounts.png
*/

View File

@@ -22,8 +22,7 @@ The next step is to add an input data source to the case. The <strong>Add Data S
 - For local disk, select one of the detected disks. Autopsy will add the current view of the disk to the case (i.e. snapshot of the meta-data). However, the individual file content (not meta-data) does get updated with the changes made to the disk. You can optionally create a copy of all data read from the local disk to a VHD file, which can be useful for triage situations. Note, you may need to run Autopsy as an Administrator to detect all disks.
 - For logical files (a single file or folder of files), use the "Add" button to add one or more files or folders on your system to the case. Folders will be recursively added to the case.
-Next it will prompt you to configure the Ingest Modules.
+After supplying the needed data, Autopsy will quickly review the data sources and add minimal metadata to the case databases so that it can schedule the files for analysis. While it is doing that, it will prompt you to configure the Ingest Modules.
 \subsection s1c Ingest Modules
@@ -35,18 +34,21 @@ The standard ingest modules included with Autopsy are:
 - <strong>\subpage recent_activity_page</strong> extracts user activity as saved by web browsers and the OS. Also runs Regripper on the registry hive.
 - <strong>\subpage hash_db_page</strong> uses hash sets to ignore known files from the NIST NSRL and flag known bad files. Use the "Advanced" button to add and configure the hash sets to use during this process. You will get updates on known bad file hits as the ingest occurs. You can later add hash sets via the Tools -&gt; Options menu in the main UI. You can download an index of the NIST NSRL from http://sourceforge.net/projects/autopsy/files/NSRL/
 - <strong>\subpage file_type_identification_page</strong> determines file types based on signatures and reports them based on MIME type. It stores the results in the Blackboard and many modules depend on this. It uses the Tika open source library. You can define your own custom file types in Tools, Options, File Types.
+- <strong>\subpage extension_mismatch_detector_page</strong> uses the results from the File Type Identification and flags files that have an extension not traditionally associated with the file's detected type. Ignores 'known' (NSRL) files. You can customize the MIME types and file extensions per MIME type in Tools, Options, File Extension Mismatch.
 - <strong>\subpage embedded_file_extractor_page</strong> opens ZIP, RAR, other archive formats, Doc, Docx, PPT, PPTX, XLS, and XLSX and sends the derived files from those files back through the ingest pipeline for analysis.
 - <strong>\subpage EXIF_parser_page</strong> extracts EXIF information from JPEG files and posts the results into the tree in the main UI.
 - <strong>\subpage keyword_search_page</strong> uses keyword lists to identify files with specific words in them. You can select the keyword lists to search for automatically and you can create new lists using the "Advanced" button. Note that with keyword search, you can always conduct searches after ingest has finished. The keyword lists that you select during ingest will be searched for at periodic intervals and you will get the results in real-time. You do not need to wait for all files to be indexed before performing a keyword search, however you will only get results from files that have already been indexed when you perform your search.
 - <strong>\subpage email_parser_page</strong> identifies Thunderbird MBOX files and PST format files based on file signatures, extracting the e-mails from them, adding the results to the Blackboard.
-- <strong>\subpage extension_mismatch_detector_page</strong> uses the results from the File Type Identification and flags files that have an extension not traditionally associated with the file's detected type. Ignores 'known' (NSRL) files. You can customize the MIME types and file extensions per MIME type in Tools, Options, File Extension Mismatch.
-- <strong>\subpage data_source_integrity_page</strong> computes a checksum on E01 files and compares with the E01 file's internal checksum to ensure they match.
-- <strong>\subpage android_analyzer_page</strong> allows you to parse common items from Android devices. Places artifacts into the BlackBoard.
-- <strong>\subpage interesting_files_identifier_page</strong> searches for files and directories based on user-specified rules in Tools, Options, Interesting Files. It works as a "File Alerting Module". It generates messages in the inbox when specified files are found.
-- <strong>\subpage photorec_carver_page</strong> carves files from unallocated space and sends them through the file processing chain.
-- <strong>\subpage cr_ingest_module</strong> adds file hashes and other extracted properties to a central repository for future correlation and to flag previously notable files.
 - <strong>\subpage encryption_page</strong> looks for encrypted files.
+- <strong>\subpage interesting_files_identifier_page</strong> searches for files and directories based on user-specified rules in Tools, Options, Interesting Files. It works as a "File Alerting Module". It generates messages in the inbox when specified files are found.
+- <strong>\subpage cr_ingest_module</strong> adds file hashes and other extracted properties to a central repository for future correlation and to flag previously notable files.
+- <strong>\subpage photorec_carver_page</strong> carves files from unallocated space and sends them through the file processing chain.
 - <strong>\subpage vm_extractor_page</strong> extracts data from virtual machine files
+- <strong>\subpage data_source_integrity_page</strong> computes a checksum on E01 files and compares with the E01 file's internal checksum to ensure they match.
+- <strong>\subpage drone_page</strong> extracts data from drone files.
+- <strong>\subpage plaso_page</strong> uses Plaso to create \ref timeline_page "timeline" events.
+- <strong>\subpage android_analyzer_page</strong> allows you to parse common items from Android devices. Places artifacts into the BlackBoard.
+- <strong>\subpage gpx_page</strong> extracts geolocation data from .gpx files.
 When you select a module, you will have the option to change its settings. For example, you can configure which keyword search lists to use during ingest and which hash sets to use. Refer to the individual module help for details on configuring each module.

View File

@@ -8,6 +8,10 @@ of any coordinates found to load into software like Google Earth.
 \image html reports_select.png
+Most report types will allow you to select which data sources to include in the report. Note that the names of excluded data sources may still be present in the report. For example, the \ref report_html will list all data sources in the case on the main page but will not contain results, tagged files, etc. from the excluded data source(s).
+\image html reports_datasource_select.png
 The different types of reports will be described below. The majority of the report modules will generate a report file which
 will be displayed in the case under the "Reports" node of the \ref tree_viewer_page.

View File

@@ -1,6 +1,6 @@
-/*! \page tagging_page Tagging
+/*! \page tagging_page Tagging and Commenting
-Tagging (or Bookmarking) allows you to create a reference to a file or object and easily find it later or include it in a \ref reporting_page "report". Tagging is also used by the \ref central_repo_page "central repository" to mark items as notable.
+Tagging (or Bookmarking) allows you to create a reference to a file or object and easily find it later or include it in a \ref reporting_page "report". Tagging is also used by the \ref central_repo_page "central repository" to mark items as notable. You can add comments to files and results using tags or through the central repository.
 \section tagging_items Tagging items
@@ -99,7 +99,7 @@ If using the central repository, changing the notable status will affect tagged
 - If "File A" is tagged with "Tag A", which is not notable, and then "Tag A" is switched to notable, "File A" will be marked as notable in the central repository
 - If "File B" is tagged with "Tag B", which is notable, and then "Tag B" is switched to non-notable, if there are no other notable tags on "File B" then its notable status in the central repository will be removed.
-\section user_tags Hiding tags from other users
+\subsection user_tags Hiding tags from other users
 Tags are associated with the account name of the user that tagged them. This information is visible through selecting items under the "Tags" section of the directory tree:
@@ -113,4 +113,26 @@ It is possible to hide all tagged files and results in the "Tags" area of the tr
 \image html tagging_view_options.png
+\section tagging_commenting Commenting
+There are two ways to add comments to files and results. The first method was discussed in the \ref tagging_items section. Right click on the file or result of interest, choose "Add File Tag" or "Add Result Tag" and then "Tag and Comment". This allows you to add a comment about the item. You can add multiple tags with comments to the same file or result.
+\image html tagging_comment_context.png
+If you have a \ref central_repo_page "central repository" enabled, you can also use it to save comments about files. Right click on the file and select "Add/Edit Central Repository Comment". If there was already a comment for this file it will appear in the dialog and can be changed - only one central repository comment can be stored at a time.
+\image html tagging_cr_comment.png
+If a file or result has a comment associated with it, you'll see a notepad icon in the "C" column of the result viewer. Hovering over it will tell you what type of comments are on the item.
+\image html tagging_comment_icon.png
+You can view comments associated with tags by going to the "Tags" section of the tree viewer and selecting one of your tags. Any comments will appear in the "Comment" column in the results viewer.
+\image html tagging_comment_in_result_viewer.png
+You can view all comments on an item through the "Annotation" tab in the content viewer.
+\image html tagging_comment_anno.png
 */

View File

@@ -1,220 +0,0 @@
"""This script determines the updated, added, and deleted properties from the '.properties-MERGED' files
and generates a csv file containing the items changed. This script requires the python libraries:
gitpython and jproperties. As a consequence, it also requires git >= 1.7.0 and python >= 3.4.
"""
from git import Repo
from typing import List, Dict, Tuple
import re
import csv
from jproperties import Properties
import sys
class ItemChange:
def __init__(self, rel_path: str, key: str, prev_val: str, cur_val: str):
"""Describes the change that occurred for a particular key of a properties file.
Args:
rel_path (str): The relative path of the properties file.
key (str): The key in the properties file.
prev_val (str): The previous value for the key.
cur_val (str): The current value for the key.
"""
self.rel_path = rel_path
self.key = key
self.prev_val = prev_val
self.cur_val = cur_val
if ItemChange.has_str_content(cur_val) and not ItemChange.has_str_content(prev_val):
self.type = 'ADDITION'
elif not ItemChange.has_str_content(cur_val) and ItemChange.has_str_content(prev_val):
self.type = 'DELETION'
else:
self.type = 'CHANGE'
@staticmethod
def has_str_content(content: str):
"""Determines whether or not the content is empty or None.
Args:
content (str): The text.
Returns:
bool: Whether or not it has content.
"""
return content is not None and len(content.strip()) > 0
@staticmethod
def get_headers() -> List[str]:
"""Returns the csv headers to insert when serializing a list of ItemChange objects to csv.
Returns:
List[str]: The column headers
"""
return ['Relative Path', 'Key', 'Change Type', 'Previous Value', 'Current Value']
def get_row(self) -> List[str]:
"""Returns the list of values to be entered as a row in csv serialization.
Returns:
List[str]: The list of values to be entered as a row in csv serialization.
"""
return [
self.rel_path,
self.key,
self.type,
self.prev_val,
self.cur_val]
def get_entry_dict(diff_str: str) -> Dict[str, str]:
"""Retrieves a dictionary mapping the properties represented in the string.
Args:
diff_str (str): The string of the properties file.
Returns:
Dict[str,str]: The mapping of keys to values in that properties file.
"""
props = Properties()
props.load(diff_str, "utf-8")
return props.properties
def get_item_change(rel_path: str, key: str, prev_val: str, cur_val: str) -> ItemChange:
"""Returns an ItemChange object if the previous value is not equal to the current value.
Args:
rel_path (str): The relative path for the properties file.
key (str): The key within the properties file for this potential change.
prev_val (str): The previous value.
cur_val (str): The current value.
Returns:
ItemChange: The ItemChange object or None if values are the same.
"""
if (prev_val == cur_val):
return None
else:
return ItemChange(rel_path, key, prev_val, cur_val)
def get_changed(rel_path: str, a_str: str, b_str: str) -> List[ItemChange]:
"""Given the relative path of the properties file that
Args:
rel_path (str): The relative path for the properties file.
a_str (str): The string representing the original state of the file.
b_str (str): The string representing the current state of the file.
Returns:
List[ItemChange]: The changes determined.
"""
print('Retrieving changes for {}...'.format(rel_path))
a_dict = get_entry_dict(a_str)
b_dict = get_entry_dict(b_str)
all_keys = set().union(a_dict.keys(), b_dict.keys())
mapped = map(lambda key: get_item_change(
rel_path, key, a_dict.get(key), b_dict.get(key)), all_keys)
return filter(lambda entry: entry is not None, mapped)
def get_text(blob) -> str:
return blob.data_stream.read().decode('utf-8')
def get_changed_from_diff(rel_path: str, diff) -> List[ItemChange]:
"""Determines changes from a git python diff.
Args:
rel_path (str): The relative path for the properties file.
diff: The git python diff.
Returns:
List[ItemChange]: The changes in properties.
"""
# an item was added
if diff.change_type == 'A':
changes = get_changed(rel_path, '', get_text(diff.b_blob))
# an item was deleted
elif diff.change_type == 'D':
changes = get_changed(rel_path, get_text(diff.a_blob), '')
# an item was modified
elif diff.change_type == 'M':
changes = get_changed(rel_path, get_text(
diff.a_blob), get_text(diff.b_blob))
else:
changes = []
return changes
def get_rel_path(diff) -> str:
"""Determines the relative path based on the git python.
Args:
diff: The git python diff.
Returns:
str: The determined relative path.
"""
if diff.b_path is not None:
return diff.b_path
elif diff.a_path is not None:
return diff.a_path
else:
return '<Unknown Path>'
def write_diff_to_csv(repo_path: str, output_path: str, commit_1_id: str, commit_2_id: str):
"""Determines the changes made in '.properties-MERGED' files from one commit to another commit.
Args:
repo_path (str): The local path to the git repo.
output_path (str): The output path for the csv file.
commit_1_id (str): The initial commit for the diff.
commit_2_id (str): The latest commit for the diff.
"""
repo = Repo(repo_path)
commit_1 = repo.commit(commit_1_id)
commit_2 = repo.commit(commit_2_id)
diffs = commit_1.diff(commit_2)
with open(output_path, 'w', newline='') as csvfile:
writer = csv.writer(csvfile)
writer.writerow(ItemChange.get_headers())
for diff in diffs:
rel_path = get_rel_path(diff)
if not rel_path.endswith('.properties-MERGED'):
continue
changes = get_changed_from_diff(rel_path, diff)
for item_change in changes:
writer.writerow(item_change.get_row())
def print_help():
"""Prints a quick help message.
"""
print("diffscript.py [path to repo] [csv output path] [commit for previous release] [commit for current release (optional; defaults to 'HEAD')]")
def main():
if len(sys.argv) <= 3:
print_help()
sys.exit(1)
repo_path = sys.argv[1]
output_path = sys.argv[2]
commit_1_id = sys.argv[3]
commit_2_id = sys.argv[4] if len(sys.argv) > 4 else 'HEAD'
write_diff_to_csv(repo_path, output_path, commit_1_id, commit_2_id)
sys.exit(0)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,2 @@
__pycache__
.idea

View File

@@ -0,0 +1,20 @@
## Description
This folder provides tools to handle updates of bundle files for language localization. There are three main scripts:
- `allbundlesscript.py` - generates a csv file containing the relative path of the bundle file, the key, and the value for each property.
- `diffscript.py` - determines the property values that have changed between two commits and generates a csv file containing the relative path, the key, the previous value, the new value, and the change type (addition, deletion, change).
- `updatepropsscript.py` - given a csv file containing the relative path of the bundle, the key, and the new value, updates the property values for a given language within the project.
All of these scripts provide more details on usage by calling the script with `-h`.
## Basic Localization Update Workflow
1. Call `python3 diffscript.py <output path> -l <language>` to generate a csv file containing differences in properties file values from the language's previous commit to the `HEAD` commit. The language identifier should be the abbreviated identifier used for the bundle (i.e. 'ja' for Japanese). The output path should be specified as a relative path with the dot slash notation (i.e. `./outputpath.csv`) or an absolute path.
2. Update the csv file with translations.
3. Call `python3 updatepropsscript.py <input path> -l <language>` to update properties files based on the newly generated csv file. The csv file should be formatted so that the columns are the bundle relative path, the property file key, the translated value, and the commit id of the latest commit that these changes represent; the commit id only needs to be in the header row (see the sketch below for an example of this layout). The input path should be specified as a relative path with the dot slash notation (i.e. `./inputpath.csv`) or an absolute path.
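For illustration, here is a minimal sketch of a translations csv in that layout; the bundle path, key, translated value, and commit id below are all made up:

import csv

# Hypothetical rows: bundle relative path, key, translated value.
# The commit id ('abc1234' here) appears only in the header row.
rows = [
    ['Relative path', 'Key', 'Value', 'abc1234'],
    ['Core/src/org/sleuthkit/autopsy/casemodule/Bundle.properties-MERGED',
     'CTL_CaseAction', 'ケース'],
]

with open('./translations_ja.csv', 'w', encoding='utf-8-sig', newline='') as csv_file:
    csv.writer(csv_file).writerows(rows)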
## Localization Generation for the First Time
First-time updates should follow a similar procedure except that instead of calling `diffscript.py`, call `python3 allbundlesscript.py <output path>` to generate a csv file with the relative paths of bundle files, property file keys, and property file values. The output path should be specified as a relative path with the dot slash notation (i.e. `./outputpath.csv`) or an absolute path.
## Unit Tests
Unit tests can be run from this directory using `python3 -m unittest`.

View File

@@ -0,0 +1,73 @@
"""This script finds all '.properties-MERGED' files and writes relative path, key, and value to a CSV file.
This script requires the python libraries: gitpython and jproperties. As a consequence, it also requires
git >= 1.7.0 and python >= 3.4. This script relies on fetching 'HEAD' from current branch. So make sure
repo is on correct branch (i.e. develop).
"""
import sys
from envutil import get_proj_dir
from fileutil import get_filename_addition, OMITTED_ADDITION
from gitutil import get_property_file_entries, get_commit_id, get_git_root
from csvutil import records_to_csv
from typing import Union
import re
import argparse
def write_items_to_csv(repo_path: str, output_path: str, show_commit: bool, value_regex: Union[str, None] = None):
"""Determines the contents of '.properties-MERGED' files and writes to a csv file.
Args:
repo_path (str): The local path to the git repo.
output_path (str): The output path for the csv file.
show_commit (bool): Whether or not to include the commit id in the header
value_regex (Union[str, None]): If non-none, only key value pairs where the value is a regex match with this
value will be included.
"""
row_header = ['Relative path', 'Key', 'Value']
if show_commit:
row_header.append(get_commit_id(repo_path, 'HEAD'))
rows = []
omitted = []
for entry in get_property_file_entries(repo_path):
new_entry = [entry.rel_path, entry.key, entry.value]
if value_regex is None or re.match(value_regex, entry.value):
rows.append(new_entry)
else:
omitted.append(new_entry)
records_to_csv(output_path, [row_header] + rows)
if len(omitted) > 0:
records_to_csv(get_filename_addition(output_path, OMITTED_ADDITION), [row_header] + omitted)
def main():
# noinspection PyTypeChecker
parser = argparse.ArgumentParser(description='Gathers all key-value pairs within .properties-MERGED files into '
'one csv file.',
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument(dest='output_path', type=str, help='The path to the output csv file. The output path should be'
' specified as a relative path with the dot slash notation '
'(i.e. \'./outputpath.csv\') or an absolute path.')
parser.add_argument('-r', '--repo', dest='repo_path', type=str, required=False,
help='The path to the repo. If not specified, path of script is used.')
parser.add_argument('-nc', '--no_commit', dest='no_commit', action='store_true', default=False,
required=False, help="Suppresses adding commits to the generated csv header.")
args = parser.parse_args()
repo_path = args.repo_path if args.repo_path is not None else get_git_root(get_proj_dir())
output_path = args.output_path
show_commit = not args.no_commit
write_items_to_csv(repo_path, output_path, show_commit)
sys.exit(0)
if __name__ == "__main__":
main()

View File

@@ -0,0 +1,51 @@
"""Provides tools for parsing and writing to a csv file.
"""
from typing import List, Iterable, Tuple
import csv
import os
def records_to_csv(output_path: str, rows: Iterable[List[str]]):
"""Writes rows to a csv file at the specified path.
Args:
output_path (str): The path where the csv file will be written.
rows (List[List[str]]): The rows to be written. Each row of a
list of strings will be written according
to their index (i.e. column 3 will be index 2).
"""
parent_dir, file = os.path.split(output_path)
if not os.path.exists(parent_dir):
os.makedirs(parent_dir)
with open(output_path, 'w', encoding="utf-8-sig", newline='') as csvfile:
writer = csv.writer(csvfile)
for row in rows:
writer.writerow(row)
def csv_to_records(input_path: str, header_row: bool) -> Tuple[List[List[str]], List[str]]:
"""Writes rows to a csv file at the specified path.
Args:
input_path (str): The path where the csv file will be written.
header_row (bool): Whether or not there is a header row to be skipped.
"""
with open(input_path, encoding='utf-8-sig') as csv_file:
csv_reader = csv.reader(csv_file, delimiter=',')
header = None
results = []
try:
for row in csv_reader:
if header_row:
header = row
header_row = False
else:
results.append(row)
except Exception as e:
raise Exception("There was an error parsing csv {path}".format(path=input_path), e)
return results, header
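For illustration, a quick round-trip sketch using these helpers (the path and values below are made up):

from csvutil import records_to_csv, csv_to_records

# Write a header plus one data row, then read the file back.
records_to_csv('./example/output.csv',
               [['Relative path', 'Key', 'Value'],
                ['Core/Bundle.properties-MERGED', 'SomeKey', 'Some value']])
rows, header = csv_to_records('./example/output.csv', header_row=True)
print(header)  # ['Relative path', 'Key', 'Value']
print(rows)    # [['Core/Bundle.properties-MERGED', 'SomeKey', 'Some value']]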

View File

@@ -0,0 +1,97 @@
"""This script determines the updated, added, and deleted properties from the '.properties-MERGED' files
and generates a csv file containing the items changed. This script requires the python libraries:
gitpython and jproperties. As a consequence, it also requires git >= 1.7.0 and python >= 3.4.
"""
import re
import sys
from envutil import get_proj_dir
from fileutil import get_filename_addition, OMITTED_ADDITION
from gitutil import get_property_files_diff, get_commit_id, get_git_root
from itemchange import ItemChange, ChangeType
from csvutil import records_to_csv
import argparse
from typing import Union
from langpropsutil import get_commit_for_language, LANG_FILENAME
def write_diff_to_csv(repo_path: str, output_path: str, commit_1_id: str, commit_2_id: str, show_commits: bool,
value_regex: Union[str, None] = None):
"""Determines the changes made in '.properties-MERGED' files from one commit to another commit.
Args:
repo_path (str): The local path to the git repo.
output_path (str): The output path for the csv file.
commit_1_id (str): The initial commit for the diff.
commit_2_id (str): The latest commit for the diff.
show_commits (bool): Show commits in the header row.
value_regex (Union[str, None]): If non-none, only key value pairs where the value is a regex match with this
value will be included.
"""
row_header = ItemChange.get_headers()
if show_commits:
row_header += [get_commit_id(repo_path, commit_1_id), get_commit_id(repo_path, commit_2_id)]
rows = []
omitted = []
for entry in get_property_files_diff(repo_path, commit_1_id, commit_2_id):
new_entry = entry.get_row()
if value_regex is not None and (entry.type == ChangeType.DELETION or not re.match(value_regex, entry.cur_val)):
omitted.append(new_entry)
else:
rows.append(new_entry)
records_to_csv(output_path, [row_header] + rows)
if len(omitted) > 0:
records_to_csv(get_filename_addition(output_path, OMITTED_ADDITION), [row_header] + omitted)
def main():
# noinspection PyTypeChecker
parser = argparse.ArgumentParser(description="Determines the updated, added, and deleted properties from the "
"'.properties-MERGED' files and generates a csv file containing "
"the items changed.",
formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument(dest='output_path', type=str, help='The path to the output csv file. The output path should '
'be specified as a relative path with the dot slash notation'
' (i.e. \'./outputpath.csv\') or an absolute path.')
parser.add_argument('-r', '--repo', dest='repo_path', type=str, required=False,
help='The path to the repo. If not specified, path of script is used.')
parser.add_argument('-fc', '--first-commit', dest='commit_1_id', type=str, required=False,
help='The commit for previous release. This flag or the language flag need to be specified'
' in order to determine a start point for the difference.')
parser.add_argument('-lc', '--latest-commit', dest='commit_2_id', type=str, default='HEAD', required=False,
help='The commit for current release.')
parser.add_argument('-nc', '--no-commits', dest='no_commits', action='store_true', default=False,
required=False, help="Suppresses adding commits to the generated csv header.")
parser.add_argument('-l', '--language', dest='language', type=str, default=None, required=False,
help='Specify the language in order to determine the first commit to use (i.e. \'ja\' for '
'Japanese). This flag overrides the first-commit flag.')
args = parser.parse_args()
repo_path = args.repo_path if args.repo_path is not None else get_git_root(get_proj_dir())
output_path = args.output_path
commit_1_id = args.commit_1_id
lang = args.language
if lang is not None:
commit_1_id = get_commit_for_language(lang)
if commit_1_id is None:
print('Either the first commit or language flag need to be specified. If specified, the language file, ' +
LANG_FILENAME + ', may not have the latest commit for the language.', file=sys.stderr)
parser.print_help(sys.stderr)
sys.exit(1)
commit_2_id = args.commit_2_id
show_commits = not args.no_commits
write_diff_to_csv(repo_path, output_path, commit_1_id, commit_2_id, show_commits)
sys.exit(0)
if __name__ == "__main__":
main()
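For reference, a hypothetical programmatic call of write_diff_to_csv (the command line handled by main() is the normal entry point; the repo path, commit id, and output path below are made up):

from diffscript import write_diff_to_csv

# Diff the '.properties-MERGED' files between the previous release commit and HEAD
# and write the changed keys to a csv file for translators.
write_diff_to_csv(repo_path='/path/to/autopsy',
                  output_path='./propdiff.csv',
                  commit_1_id='release-4.15.0',
                  commit_2_id='HEAD',
                  show_commits=True)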

View File

@@ -0,0 +1,17 @@
"""Functions relating to the project environment.
"""
import pathlib
from typing import Union
def get_proj_dir(path: Union[pathlib.PurePath, str] = __file__) -> str:
"""
Gets parent directory of this file (and subsequently, the project).
Args:
path: Can be overridden to provide a different file. This will return the parent of that file in that instance.
Returns:
The project folder or the parent folder of the file provided.
"""
return str(pathlib.Path(path).parent.absolute())

View File

@@ -0,0 +1,63 @@
import os
from typing import Union, Tuple
from pathlib import Path
def get_path_pieces(orig_path: str) -> Tuple[str, Union[str, None], Union[str, None]]:
"""Retrieves path pieces. This is a naive approach as it determines if a file is present based on the
presence of an extension.
Args:
orig_path: The original path to deconstruct.
Returns: A tuple of directory, filename and extension. If no extension is present, filename and extension are None.
"""
potential_parent_dir, orig_file = os.path.split(str(Path(orig_path)))
filename, file_extension = os.path.splitext(orig_file)
if file_extension.startswith('.'):
file_extension = file_extension[1:]
if file_extension is None or len(file_extension) < 1:
return str(Path(orig_path)), None, None
else:
return potential_parent_dir, filename, file_extension
def get_new_path(orig_path: str, new_filename: str) -> str:
"""Obtains a new path. This tries to determine if the provided path is a directory or filename (has an
extension containing '.') then constructs the new path with the old parent directory and the new filename.
Args:
orig_path (str): The original path.
new_filename (str): The new filename to use.
Returns:
str: The new path.
"""
parent_dir, filename, ext = get_path_pieces(orig_path)
return str(Path(parent_dir) / Path(new_filename))
# For use with creating csv filenames for entries that have been omitted.
OMITTED_ADDITION = '-omitted'
def get_filename_addition(orig_path: str, filename_addition: str) -> str:
"""Gets filename with addition. So if item is '/path/name.ext' and the filename_addition is '-add', the new result
would be '/path/name-add.ext'.
Args:
orig_path (str): The original path.
filename_addition (str): The new addition.
Returns: The altered path.
"""
parent_dir, filename, extension = get_path_pieces(orig_path)
if filename is None:
return str(Path(orig_path + filename_addition))
else:
ext = '' if extension is None else extension
return str(Path(parent_dir) / Path('{0}{1}.{2}'.format(filename, filename_addition, ext)))
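For illustration, a few example calls (the paths are made up; output shown for a POSIX-style path):

from fileutil import get_path_pieces, get_filename_addition, OMITTED_ADDITION

print(get_path_pieces('/docs/bundles.csv'))  # ('/docs', 'bundles', 'csv')
print(get_path_pieces('/docs/bundles'))      # ('/docs/bundles', None, None) - no extension, so treated as a directory
print(get_filename_addition('/docs/bundles.csv', OMITTED_ADDITION))  # /docs/bundles-omitted.csv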

Some files were not shown because too many files have changed in this diff.