I have an application that watches a text file, and when any change is made to the file I handle it in an OnChanged event handler. I am using NotifyFilters.LastWrite, but the event is still raised twice. Here is the code:

public void Initialize()
{
    FileSystemWatcher _fileWatcher = new FileSystemWatcher();
    _fileWatcher.Path = "C:\\Folder";
    _fileWatcher.NotifyFilter = NotifyFilters.LastWrite;
    _fileWatcher.Filter = "Version.txt";
    _fileWatcher.Changed += new FileSystemEventHandler(OnChanged);
    _fileWatcher.EnableRaisingEvents = true;
}

private void OnChanged(object source, FileSystemEventArgs e)
{
   .......
}

In my case, OnChanged is called twice when I change the text file Version.txt and save it.


Current answer

I managed to do this by adding a function that checks for duplicates in a buffer array.

Then a timer performs the action once the array has not been modified for X amount of time (a C# sketch of this timer idea is shown after the VB code below):
- reset the timer every time something is written to the buffer
- perform the action on tick

This also catches another kind of duplicate: if you modify a file inside a folder, the folder raises a Change event as well.

' Compares the new event against the last buffered entry (kept in lb_actions_list)
' to detect duplicates, including the extra Change event raised for the parent folder.
Function is_duplicate(str1 As String) As Boolean
    If lb_actions_list.Items.Count = 0 Then
        Return False
    Else
        Dim compStr As String = lb_actions_list.Items(lb_actions_list.Items.Count - 1).ToString
        compStr = compStr.Substring(compStr.IndexOf("-") + 1).Trim

        If compStr <> str1 AndAlso compStr.parentDir <> str1 & "\" Then
            Return False
        Else
            Return True
        End If
    End If
End Function

Imports System.Runtime.CompilerServices

Public Module extentions
    ' Returns the parent directory (with trailing backslash) of a path string.
    <Extension()>
    Public Function parentDir(ByVal aString As String) As String
        Return aString.Substring(0, CInt(InStrRev(aString, "\", aString.Length - 1)))
    End Function
End Module
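
The timer part described above is only sketched in words, so here is a minimal C# version of the same debounce idea, assuming a hypothetical watcher on C:\Folder and an arbitrary 500 ms quiet period: changed paths are collected in a set (which collapses duplicates), the timer restarts on every event, and the work runs only on the final tick. A Directory.Exists check also drops the extra Change event raised for the parent folder.

using System;
using System.Collections.Generic;
using System.IO;
using System.Timers;

class DebouncedWatcher
{
    private readonly FileSystemWatcher _watcher = new FileSystemWatcher(@"C:\Folder"); // hypothetical path
    private readonly HashSet<string> _buffer = new HashSet<string>();                  // collapses duplicate paths
    private readonly Timer _timer = new Timer(500) { AutoReset = false };              // arbitrary quiet period
    private readonly object _lock = new object();

    public DebouncedWatcher()
    {
        _timer.Elapsed += OnTick;
        _watcher.NotifyFilter = NotifyFilters.LastWrite;
        _watcher.Changed += OnChanged;
        _watcher.EnableRaisingEvents = true;
    }

    private void OnChanged(object sender, FileSystemEventArgs e)
    {
        if (Directory.Exists(e.FullPath))
            return; // ignore the extra Change event raised for the containing folder

        lock (_lock)
        {
            _buffer.Add(e.FullPath); // duplicate events for the same file end up as one entry
            _timer.Stop();
            _timer.Start();          // restart the quiet-period timer on every event
        }
    }

    private void OnTick(object sender, ElapsedEventArgs e)
    {
        string[] paths;
        lock (_lock)
        {
            paths = new string[_buffer.Count];
            _buffer.CopyTo(paths);
            _buffer.Clear();
        }

        foreach (string path in paths)
            Console.WriteLine("Changed: " + path); // do the real work once per file here
    }
}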

Other answers

My scenario is that I have a virtual machine with a Linux server on it. I develop files on the Windows host. When I change something in a folder on the host, I want all changes to be uploaded and synced to the virtual server over FTP. This is how I eliminate the duplicate change events when I write to a file (which also flags the folder containing the file as modified):

private Hashtable fileWriteTime = new Hashtable();

private void fsw_sync_Changed(object source, FileSystemEventArgs e)
{
    string path = e.FullPath.ToString();
    string currentLastWriteTime = File.GetLastWriteTime( e.FullPath ).ToString();

    // if there is no path info stored yet,
    // or the stored path has a different write time than the one now being inspected
    if ( !fileWriteTime.ContainsKey(path) ||
         fileWriteTime[path].ToString() != currentLastWriteTime
    )
    {
        //then we do the main thing
        log( "A CHANGE has occured with " + path );

        //lastly we update the last write time in the hashtable
        fileWriteTime[path] = currentLastWriteTime;
    }
}

Basically, I create a hashtable to store the file write-time information. Then, if the hashtable already contains the path of the modified file and its stored time is the same as the change currently being notified, I know the event is a duplicate and ignore it.

I know this is an old question, but I ran into the same problem and none of the solutions above really fixed what I was facing. I created a dictionary that maps the file name to its LastWriteTime. If the file is not in the dictionary, processing continues; otherwise it checks when the file was last modified and, if that differs from the time stored in the dictionary, runs the code.

Dictionary<string, DateTime> dateTimeDictionary = new Dictionary<string, DateTime>();

private void OnChanged(object source, FileSystemEventArgs e)
{
    if (!dateTimeDictionary.ContainsKey(e.FullPath) ||
        System.IO.File.GetLastWriteTime(e.FullPath) != dateTimeDictionary[e.FullPath])
    {
        dateTimeDictionary[e.FullPath] = System.IO.File.GetLastWriteTime(e.FullPath);

        //your code here
    }
}

Try this; it works well.

    private static readonly FileSystemWatcher Watcher = new FileSystemWatcher();
    static void Main(string[] args)
    {
        Console.WriteLine("Watching....");

        Watcher.Path = @"D:\Temp\Watcher";
        Watcher.Changed += OnChanged;
        Watcher.EnableRaisingEvents = true;
        Console.ReadKey();
    }

    static void OnChanged(object sender, FileSystemEventArgs e)
    {
        try
        {
            Watcher.Changed -= OnChanged;
            Watcher.EnableRaisingEvents = false;
            Console.WriteLine($"File Changed. Name: {e.Name}");
        }
        catch (Exception exception)
        {
            Console.WriteLine(exception);
        }
        finally
        {
            Watcher.Changed += OnChanged;
            Watcher.EnableRaisingEvents = true;
        }
    }

Even though it was not asked for, it is a pity that there is no ready-made sample solution in F#. So to fix that, here is my recipe, simply because I can and because F# is a wonderful .NET language.

Duplicate events are filtered out using the FSharp.Control.Reactive package, which is just an F# wrapper over Reactive Extensions. All of this can target the full framework or netstandard2.0:

let createWatcher path filter () =
    new FileSystemWatcher(
        Path = path,
        Filter = filter,
        EnableRaisingEvents = true,
        SynchronizingObject = null // not needed for console applications
    )

let createSources (fsWatcher: FileSystemWatcher) =
    // use only the events needed here;
    // map `Error` and `Renamed` events if they should be merged as well
    [| fsWatcher.Changed :> IObservable<_>
       fsWatcher.Deleted :> IObservable<_>
       fsWatcher.Created :> IObservable<_>
       //fsWatcher.Renamed |> Observable.map renamedToNeeded
       //fsWatcher.Error   |> Observable.map errorToNeeded
    |] |> Observable.mergeArray

let handle (e: FileSystemEventArgs) =
    printfn "handle %A event '%s' '%s' " e.ChangeType e.Name e.FullPath 

let watch path filter throttleTime =
    // disposes watcher if observer subscription is disposed
    Observable.using (createWatcher path filter) createSources
    // filter out multiple equal events
    |> Observable.distinctUntilChanged
    // filter out multiple Changed
    |> Observable.throttle throttleTime
    |> Observable.subscribe handle

[<EntryPoint>]
let main _args =
    let path = @"C:\Temp\WatchDir"
    let filter = "*.zip"
    let throttleTime = TimeSpan.FromSeconds 10.
    use _subscription = watch path filter throttleTime
    System.Console.ReadKey() |> ignore
    0 // return an integer exit code
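
If you would rather stay in C#, the same throttling idea can be expressed with the System.Reactive NuGet package. Here is a minimal sketch, where the path, the *.zip filter, and the one-second throttle window are just placeholder assumptions:

using System;
using System.IO;
using System.Reactive.Linq;

class Program
{
    static void Main()
    {
        var watcher = new FileSystemWatcher(@"C:\Temp\WatchDir", "*.zip")
        {
            NotifyFilter = NotifyFilters.LastWrite,
            EnableRaisingEvents = true
        };

        // Wrap the Changed event in an IObservable and collapse bursts of duplicate events.
        IDisposable subscription = Observable
            .FromEventPattern<FileSystemEventHandler, FileSystemEventArgs>(
                h => watcher.Changed += h,
                h => watcher.Changed -= h)
            .Select(ep => ep.EventArgs.FullPath)
            .Throttle(TimeSpan.FromSeconds(1))   // only the last event of a burst gets through
            .Subscribe(path => Console.WriteLine($"Changed: {path}"));

        Console.ReadKey();
        subscription.Dispose();
    }
}

Throttle only lets the last event of a burst through, so the duplicate Changed notifications collapse into a single callback.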

Here is my approach:

// Consider having a List<String> named _changedFiles

private void OnChanged(object source, FileSystemEventArgs e)
{
    lock (_changedFiles)
    {
        if (_changedFiles.Contains(e.FullPath))
        {
            return;
        }
        _changedFiles.Add(e.FullPath);
    }

    // do your stuff

    System.Timers.Timer timer = new System.Timers.Timer(1000) { AutoReset = false };
    timer.Elapsed += (timerElapsedSender, timerElapsedArgs) =>
    {
        lock (_changedFiles)
        {
            _changedFiles.Remove(e.FullPath);
        }
    };
    timer.Start();
}

This is the solution I used for this problem in a project where I send files as attachments in e-mails. It easily avoids the event firing twice, even with a smaller timer interval, but in my case 1000 ms is fine because I would rather miss a few changes than flood the mailbox with more than one message per second. At least it works fine when several files are changed at the same time.

Another solution I've thought of would be to replace the list with a dictionary mapping files to their respective MD5 hashes, so you wouldn't have to choose an arbitrary interval: instead of deleting the entry you would just update its value, and cancel your stuff if it hasn't changed. The downside is that the dictionary keeps growing in memory as files are monitored, but I've read somewhere that the number of files monitored depends on the FSW's internal buffer, so maybe that's not critical. I also don't know how the MD5 computing time would affect your code's performance, so be careful.
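
For what it's worth, here is a minimal sketch of that MD5 idea, with a hypothetical ContentChangeFilter helper; note that the writer may still hold a lock on the file when the Changed event arrives, so a real version would probably need a retry around the read:

using System;
using System.Collections.Generic;
using System.IO;
using System.Security.Cryptography;

// Hypothetical dedupe-by-content helper: only reports a change when the file's hash differs.
class ContentChangeFilter
{
    private readonly Dictionary<string, string> _hashes = new Dictionary<string, string>();

    public bool HasContentChanged(string path)
    {
        string hash = ComputeMd5(path);
        if (_hashes.TryGetValue(path, out string previous) && previous == hash)
            return false;            // same content => treat the event as a duplicate

        _hashes[path] = hash;        // update instead of removing, so no timer interval is needed
        return true;
    }

    private static string ComputeMd5(string path)
    {
        using (var md5 = MD5.Create())
        using (var stream = File.OpenRead(path))
        {
            return Convert.ToBase64String(md5.ComputeHash(stream));
        }
    }
}

The watcher's OnChanged handler would then just call HasContentChanged(e.FullPath) and return early when it reports false.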