Training deep neural networks often requires careful hyper-parameter tuning and significant computational resources. In this paper, we propose ConvTimeNet (CTN): an off-the-shelf deep convolutional neural network (CNN) trained on diverse univariate time series classification (TSC) source tasks. Once trained, CTN can be easily adapted to new TSC target tasks via a small amount of fine-tuning using labeled instances from the target tasks. The length of convolutional filters is a key aspect when building a pre-trained model that can generalize to time series of different lengths across datasets. To achieve this, we incorporate filters of multiple lengths in all convolutional layers of CTN to capture temporal features at multiple time scales. We consider all 65 datasets with time series of lengths up to 512 points from the UCR TSC Benchmark for training and testing transferability of CTN: we train CTN on a randomly chosen subset of 24 datasets using a multi-head approach with a different softmax layer for each training dataset, and study generalizability and transferability of the learned filters on the remaining 41 datasets. We observe significant gains in classification accuracy as well as computational efficiency when using pre-trained CTN as a starting point for subsequent task-specific fine-tuning compared to existing state-of-the-art TSC approaches. We also provide qualitative insights into the working of CTN by: i) analyzing the activations and filters of the first convolution layer, suggesting the filters in CTN are generically useful, ii) analyzing the impact of the design decision to incorporate filters of multiple lengths, and iii) finding regions of time series that affect the final classification decision via occlusion.

Recently, transfer learning for TSC using deep neural networks has been explored, e.g., using RNNs and using CNNs.
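The multi-length filter design described above can be illustrated with a minimal numpy sketch: 1-D filters of several lengths are applied to the same series and their feature maps are concatenated along the channel axis. The filter lengths, filter counts, and random weights here are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def multi_length_conv(x, filter_lengths=(4, 8, 16), n_filters_per_length=2, seed=0):
    """CTN-style layer sketch: filters of multiple lengths on one series,
    feature maps stacked along the channel axis (illustrative values only)."""
    rng = np.random.default_rng(seed)
    feature_maps = []
    for L in filter_lengths:
        filters = rng.standard_normal((n_filters_per_length, L))
        for w in filters:
            # pad so every filter length yields a map the same length as x
            fmap = np.convolve(np.pad(x, (L // 2, L - 1 - L // 2)), w, mode="valid")
            feature_maps.append(np.maximum(fmap, 0.0))  # ReLU
    return np.stack(feature_maps)  # shape: (total_filters, len(x))

x = np.sin(np.linspace(0.0, 6.28, 128))
out = multi_length_conv(x)
print(out.shape)  # (6, 128)
```

Because every map is padded to the input length, maps from short and long filters align and can be stacked, which is what lets one layer mix time scales.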
#Copyright 2018 UNIST under XAI Project supported by Ministry of Science and ICT, Korea
#Licensed under the Apache License, Version 2.0 (the "License");
#you may not use this file except in compliance with the License.
#Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
#See the License for the specific language governing permissions and limitations under the License.

import numpy as np
import tensorflow as tf
from sklearn.model_selection import train_test_split

#= Hyperparameters

def customized_rnn_seq2seq(encoder_inputs, decoder_inputs, cell, dtype=tf.float32, scope=None):
    # Encoder: plain static RNN; its final state seeds the decoder
    with tf.variable_scope(scope or "basic_rnn_seq2seq"):
        encoder_outputs, enc_state = tf.nn.static_rnn(cell, encoder_inputs, dtype=dtype)
        return customized_rnn_decoder(encoder_outputs, decoder_inputs, enc_state, cell)

def customized_rnn_decoder(encoder_outputs, decoder_inputs, initial_state, cell,
                           loop_function=None, scope=None):
    with tf.variable_scope(scope or "rnn_decoder"):
        state, outputs, prev = initial_state, [], None
        for i, inp in enumerate(decoder_inputs):
            if loop_function is not None and prev is not None:
                # feed the previous output back in as the next input
                with tf.variable_scope("loop_function", reuse=True):
                    inp = loop_function(prev, i)
            if i > 0:
                tf.get_variable_scope().reuse_variables()
            output, state = cell(inp, state)
            outputs.append(output)
            if loop_function is not None:
                prev = output
    return encoder_outputs, outputs, state

# Load one UCR dataset, pool train and test splits, then re-split 80/20
datadir = '/home/sohee/UCR_TS_Archive_2015' + '/' + dataset + '/' + dataset
data_train = np.loadtxt(datadir + '_TRAIN', delimiter=',')
data_test = np.loadtxt(datadir + '_TEST', delimiter=',')
DATA = np.concatenate((data_train, data_test), axis=0)
X_train, X_val, y_train, y_val = train_test_split(X_data, y_data, test_size=0.2, random_state=1)

# GRU cell with dropout; module paths assume TensorFlow 1.x
with tf.variable_scope("rnn_" + str(index)):
    cell = tf.nn.rnn_cell.GRUCell(num_units=hidden_size)
    cell = tf.nn.rnn_cell.DropoutWrapper(cell, output_keep_prob=0.5)
en_outputs, de_outputs, state = customized_rnn_seq2seq(encoder_input, decoder_input, cell)
# de_outputs projection (elided)

Traindatasets = ['Plane', 'Gun_Point', 'ArrowHead', 'WordsSynonyms', 'ToeSegmentation1',
                 'FISH', 'ShapeletSim', 'ShapesAll', 'SonyAIBORobotSurfaceII', 'Lighting7',
                 'ToeSegmentation2', 'DiatomSizeReduction', 'Ham', 'SonyAIBORobotSurface',
                 'TwoLeadECG', 'FacesUCR']

# One softmax cross-entropy term per training dataset (multi-head)
losses = []
for logit, one_hot in zip(logits, one_hots):
    losses.append(tf.reduce_mean(
        tf.nn.softmax_cross_entropy_with_logits(logits=logit, labels=one_hot)))

print("Epoch {}/1000".format(epoch))
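The multi-head loss in the listing above averages one softmax cross-entropy term per training dataset. A minimal numpy sketch of that computation follows; the head count, class counts, and batch sizes are illustrative assumptions, not values from the code above.

```python
import numpy as np

def softmax_cross_entropy(logits, one_hot):
    # Numerically stable softmax cross-entropy, averaged over the batch
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -(one_hot * log_probs).sum(axis=1).mean()

rng = np.random.default_rng(0)
# Two "heads": hypothetical datasets with 3 and 5 classes, batch of 4 each
logits = [rng.standard_normal((4, 3)), rng.standard_normal((4, 5))]
one_hots = [np.eye(3)[rng.integers(0, 3, 4)], np.eye(5)[rng.integers(0, 5, 4)]]

# One loss per dataset head, combined into a single training objective
total_loss = np.mean([softmax_cross_entropy(l, o) for l, o in zip(logits, one_hots)])
print(total_loss > 0)
```

Each head only ever sees labels from its own dataset, so the shared body is trained on all datasets while the softmax layers stay dataset-specific.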